Kinect MoCap Animator

Kinect MoCap Animator is a simple motion-capture tool that captures user motion from a body recording and saves it into an FBX animation clip. The recorded animation can later be retargeted to other humanoid models in your Unity projects, or edited with external tools like Autodesk’s ‘Maya’ or ‘3ds Max’.

Please note: the MoCap Animator is a sensor-independent tool. It does not work with live sensor data, but with body recordings only. These recordings can be created with the help of other Unity assets, like “Azure Kinect Examples for Unity” or “Kinect-v2 with MS-SDK” v2.20 or later. The package can be used on the Windows platform with all versions of the Unity Editor – Free, Plus or Pro.

How to run the ‘Kinect Mocap Animator’:
1. Create a body recording with the recorder demo scene in the K4A-asset or K2-asset.
2. Import this package into a new Unity project.
3. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
4. Copy the previously created body recording file to the root folder of the ‘Kinect MoCap Animator’ project.
5. Open and run the KinectMocapAnimator-scene.
6. Change the name of the body recording file, or the user index, if needed.
7. Press ‘Start MoCap’ to start capturing the body motion of the selected user from the beginning.
8. After the MoCap Animator finishes, you can find the recorded animation in the ‘Recorded’-model under the ‘KinectMocapFbx/Models’-folder.

For more information, see the Readme-file in the package or the online documentation (also available as a PDF).

How to retarget the recorded animation to other models:
You don’t need to replace the models in the MoCap Animator with your own to utilize the recorded animation clips. Instead, you can use the retargeting capabilities of Mecanim, Unity’s animation system. To do so, please follow these steps:

1. Copy the model with the recorded animation clip (by default: Assets/KinectMocapFbx/Models/Recorded.fbx) to the Unity project where you plan to utilize the animation. Feel free to rename it as you like.
2. Select the copied model in the Unity editor. Then switch to the Rig-tab of its Import settings, and make it Humanoid.
3. Go to the Animations-tab of the Import settings, to see the animations. Feel free to make changes, like setting the Start & End frames, Loop time, etc. Don’t forget to apply these changes.
4. Open the scene where your model should play the animation clip, and select it in Hierarchy. Make sure its Rig in its Import settings is Humanoid, as well.
5. Please also make sure the model has an Animator-component, and that there is an animator controller assigned to it. If there isn’t one, please create a new animator controller in the Assets-window and assign it to the Controller-setting of the Animator-component.
6. Open the Animator-window (menu Window / Animation / Animator).
7. Drag the recorded animation clip (or the entire model) and drop it into the Animator window. The recorded animation will show up in the Animator’s state machine.
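Once the clip is in the controller’s state machine, it can also be started from script. A minimal sketch, assuming the Animator state created by dropping the clip is named ‘Recorded’ (rename to match your controller):

```csharp
using UnityEngine;

// Minimal sketch – "Recorded" is an assumed state name, matching
// the name of the dropped animation clip in the Animator window.
public class PlayRecordedClip : MonoBehaviour
{
    void Start()
    {
        Animator animator = GetComponent<Animator>();
        animator.Play("Recorded");  // switch directly to the recorded-animation state
    }
}
```

Attach this script to the model that has the Animator-component; if the recorded state is the default state of the controller, no script is needed at all.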

* The ‘Kinect MoCap Animator’-package can be purchased and downloaded in the Unity Asset Store. All updates are, and will remain, available to all customers free of charge.

Free for education:
The package is free for academic use. If you are a student, lecturer or university researcher, please e-mail me to get a free copy of the package directly from me.

One request:
Please don’t share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.

* The online documentation of the ‘Kinect MoCap Animator’-package is also available as a pdf-document.

Output File Formats:
To get the output file in the format you need, change the ‘Output file format’-setting of the KinectFbxRecorder (a component of the KinectController game object in the scene). Don’t forget to change the extension of the saved file in the ‘Save file path’-setting, if needed. Keep in mind that in this case you may not see the mo-cap results properly in the scene. Here are the available output file formats:

  • FBX binary – saves fbx-file in binary format;
  • FBX ascii – saves fbx-file in text format;
  • FBX 6.0 binary – saves the fbx-file v6.0 in binary format. This is the default setting;
  • FBX 6.0 ascii – saves fbx-file v6.0 in text format;

What’s new in version 2.0:
1. Updated the MoCap Animator to use only body recordings, created by the K4A- and K2-assets.
2. Updated the user interface of the KinectMocapAnimator-scene.
3. Cleaned up the code. Upgraded the Kinect scripts to v1.15 and the FBX wrapper to v2019.0.
4. Upgraded to Unity Editor v2019.1.0f2.


88 thoughts on “Kinect MoCap Animator”

  1. Pingback: Motion Capture with Kinect-v2 Mocap Animator | php Technologies

    • If you mean the Mocap Animator, it is fairly simple to use. Just see the ‘Readme-Kinect2-Mocap-Animator.pdf’ in the package. If you have any expert questions, please ask Ricardo Salazar, who posts his works here from time to time. He’s the best expert in design and animation I know so far.

  2. Please send me your e-mail. I will buy the assets, but what software do I need to use? Are there lessons on actually programming with Visual Studio and the programming aspects? Can you help me, please? I want to build a game or learn how to use the capture.

    • My e-mail is on the ‘About’-page. The software you need to use is Unity 3D. The output FBX is a 3D-scene exchange format, commonly used to transfer 3D geometry and animations between software packages like Unity, Maya, 3ds Max, Blender and the like.

  3. Hi
    I just bought your MoCap editor yesterday on the Asset Store. It is perfect and solved the problem I was struggling with. I saw you are advertising the updated version (1.1). Could you please suggest where I can download it? And do I need to pay for the update?

    • Hi, I’m not sure what advertising you mean. If you bought it yesterday, you should already have v1.1. You can check the version in the What’s-New pdf-file in the package. All further updates are free of charge and can be downloaded from the Unity Asset Store. This applies to the other Kinect-v2 assets, too.

    • Hi, to make HD face-model data available in the scene, you only need to enable the ‘Get face model data’-setting of the FacetrackingManager (component of the KinectController game object). But how this data will be used, exported or added to the fbx-file is up to you. There are some use-cases, demonstrated in the face-tracking-demos of the K2-asset.

      • Thanks for the tip.

        I couldn’t find the face-tracking demos in either KinectScript/Samples/ or StandardAssets/Microsoft/Kinect/Face/.

        In order to add head rotation data when saving to a new FBX file, it seems that one must change KinectMocapFbx/Scripts/KinectFbxRecorder.cs::Update() or SaveAnimFrame(). Is that correct?

      • Rumen said:
        > I meant you can use the public API of FacetrackingManager.cs to get the face data. It is a component of KinectController-game object in MocapFbxAnimator-scene. But I’m not sure what you can do with this data afterwards.

        I want to record to fbx a humanoid animation that involves both head rotations and skeletal movement (e.g. shrugging while nodding no). During cleanup, I plan to mask-off the lower torso and legs. I want the fbx to have as little model data as possible, so it’s easy to map to different models and keep the Unity build as small as possible (for WebGL).

        I’ve looked at the package contents at!/content/18708 but it’s not clear that any of them show how to do what I want. Would you clarify?

      • > “I want to record to fbx a humanoid animation that involves both head rotations and skeletal movement.”

        Isn’t it the way you want it at the moment? For getting full head orientation, make sure the FacetrackingManager-component is enabled.

        > “I plan to mask-off the lower torso and legs.”

        You would need to modify the code of Update()-method in KinectFbxRecorder.cs, to save only the joints you want. The full source is there, so you’re free to optimize the saved data as you want it.

        > “I’ve looked at the package contents at!/content/18708 but it’s not clear that any of them show how to do what I want. Would you clarify?”

        Forget about it. It’s not related to saving orientations to fbx animations.

      • Yes, when ‘Get Face Model Data’ is selected, I can capture some head rotations.

        However, I’ve done something more and recordings are no longer previewable. In Hierarchy > UChar0 > Inspector, I renamed UChar0 to LiveReplay, and renamed UChar1 to Recorded. When I record, UChar1.fbx is overwritten instead of creating Recorded.fbx. Do you support renaming these two game objects?

        Also, in Assets > KinectMocapFbx > Models > UChar1.fbx, the preview has no play button now. It seems the cause might be related to the Rig being set to Generic, so I set it to Humanoid / CopyFromOtherAvatar / UChar0Avatar (so I can mask-out the lower body). Do you support applying a Humanoid rig?

      • There is no need to enable ‘Get face model data’, in my opinion. That turns on the HD face-data tracking – face model, etc. For simple head rotations, just enabling the FacetrackingManager should suffice.

  4. Hi there, I am a student at Falmouth University, England, studying Digital Games Animation. I have been using your software for a quick play with my Kinect sensor and I am really impressed with the results. I am having great fun and was wondering if there is any info on changing the character in the scene to one that I have made myself? A tutorial or a written guide would be awesome. Many thanks.

  5. Sorry, I just found this in the Readme pdf:
    > Optionally, look at the KinectFbxRecorder-component of KinectController-game object. You may customize
    its settings, if you like to change (for instance) the saved animation clip name, the name of the input or
    output fbx-files, the output file format, tracked player index, etc.

    But when I change the Load Path to Assets/KinectMocapFbx/Models/LiveReplay.fbx, the Save Path to Assets/KinectMocapFbx/Models/Recorded.fbx, and rename the fbx files in Assets to match, then after recording, Hierarchy shows UChar0 instead of LiveReplay. And the rigging of Recorded is Generic.

    • Yes, you need to do all of the following:
      1. Change the file names in KinectFbxRecorder-settings.
      2. Rename the respective fbx-files in the specified Assets folder.
      3. Optionally rename the animator controller for recorded file in the assets-folder, too.
      4. Run the scene, and you will get the object LiveReplay(clone) in Hierarchy.
      5. The rigging of Recorded after mocap is set to Generic, because the file gets overwritten, but you can duplicate the recorded file in Assets, and change its rigging to Humanoid.

      • For step 4, Hierarchy shows Recorded and UChar0, which should be LiveReplay.

        To reduce the size of the fbx as much as possible to just the animations, I’ve set Model > Import Materials = off. Can you suggest a way to exclude the mesh also?

        When I change the rigging (on a duplicate file) to Humanoid/CreateFromThisModel, and then try to mask-off the lower body, I have to click Fix Mask. After masking-off and clicking Apply, the Fix Mask button reappears and the masking is gone. (FWIW, when I try to create cropped clips, that also fails.)

      • These are model specific questions that have little to do with the Mocap Animator. Please see the respective Unity documentation. By the way, is there anything that succeeds, just for a change?!

      • I’d like to help make this plugin as easy to use as possible, then give it a very positive review. I can’t answer yet whether it’s working because my fbx isn’t yet in a usable state, although much of it seems to be on the right track.

        As a tool for creating animations, this plugin would be easier to use if:

        1) The output fbx contained the captured animation clip for a Humanoid avatar and as little else as possible

        2) The game objects were labelled by their purpose, e.g. LiveReplay and Recorded

        These changes would greatly reduce the how-to steps I’m assembling to share with others. Today I’m going to explore starting with a minimal Humanoid model and rigging to see if that results in an FBX that allows masking and sub-clips; if it works, my how-to steps will be longer yet.

        I’m not saying this to criticize your work but to suggest that we can work together to help others.

      • Thank you for the explanation! Of course you are right. And sorry, maybe I was too harsh. I have to answer a dozen questions each day, and sometimes I get tired of solving all kinds of issues without hearing a single good word for months. My other problem is that I’m not a model designer, and I just used the model I had at hand. My goal was to provide a simple, easy-to-use and affordable solution for capturing humanoid animations. If you have a simpler model that is not license-protected, please send it over to me by e-mail and I may use it in future Animator releases. You could also explain to me in the e-mail (if needed, with some screenshots) what exactly happened, and what you expected instead. I recently added an option not to save specific joints to the animation clip, as well as saving animation frames at a specific fps-rate. This release is not yet published, but I can share it with you, if you like.

  6. Hello! I want to record some sitting animations. Can this MoCap animator be set to capture the motion even when I sit in front of a desk, or does it just capture motions from a fixed distance?

    Also, I have a question about head orientation. Although you answered it before, I don’t quite understand what you mean by “For simple head rotations, just enabling FacetrackingManager should suffice”. Can you explain it in more detail? For example, should I change any parameters in the FacetrackingManager or elsewhere?

    Thank you for your assistance!

    • You can mo-cap whatever motions are detected by the sensor. You can see what gets detected on the avatar on the left. What you see is what you will get in the recording, too. So, my advice is to experiment a bit.

      You can also disable recording of specific joints. This is a list-setting of KinectFbxRecorder-component. This way you could exclude leg joints from the animation recording, which would be desirable in your case, I think.

      Regarding the head rotations: as you can see (again, on the avatar on the left), the KinectManager captures head rotations around the X and Z axes, but cannot detect rotations around Y (i.e. around the neck-head bone). When you enable the FacetrackingManager (you don’t need to change any of its parameters, just enable the FTM-component of the KinectController), rotations around all axes get detected, hence they can be recorded. The FTM-component is disabled by default, because the face-tracking subsystem of Kinect SDK 2.0 has caused crashes of the Unity editor in the past, on some specific machines. Hope this helps you understand.
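The step described above can also be done from script. A sketch, assuming the FacetrackingManager-component of the K2-asset sits on the KinectController object in the scene:

```csharp
using UnityEngine;

// Sketch: enables the FacetrackingManager at startup, so full head
// orientation (incl. rotation around Y) can be detected and recorded.
public class EnableFaceTracking : MonoBehaviour
{
    void Start()
    {
        FacetrackingManager ftm = FindObjectOfType<FacetrackingManager>();
        if (ftm != null)
            ftm.enabled = true;  // no other FTM parameters need to change
    }
}
```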

      • > You can also disable recording of specific joints. This is a list-setting of KinectFbxRecorder-component.

        This would be a good way to do masking. However, I can’t find this list in v1.2 under the Hierarchy tab > KinectController > Inspector > Kinect Fbx Recorder, nor under Kinect Fbx Controller or Kinect Manager.

  7. First, thanks for this great work. I purchased it on the Asset Store, but I think I don’t know how to use it. The problem is that when I start the scene, I have a grey square on the right side, but I can’t see anything of my body. I see the characters (T-Pose and Walking), but I don’t know what I need to do… My Unity version is 5.5.2 and I installed everything described in the tutorial. Can you help me?

    • According to your description, I suppose the Kinect sensor is not working, for any reason. Please start ‘SDK Browser 2.0’ (part of the Kinect SDK 2.0, you should have installed), and run ‘Kinect Configuration Verifier’ to check if the sensor is running and its data is received correctly.

      Otherwise, see the Readme-file in the package, on how to use the Mocap animator. Generally, you can start/stop animation recording with voice commands or key presses, and then retarget the saved animations to the characters in your Unity projects, or edit them in external 3d tools. I’ll post more info regarding the Mocap animator on the k2docs-online documentation soon.

  8. Dear Rumen,

    Thanks for your excellent work on the motion capture plugin!

    When I utilized the MoCap plugin, I ran into some problems. When I recorded a squatting motion, the avatar’s hips remained fixed, so the feet were above the ground, which looks very weird.

    Could you give me some suggestions to solve this issue?

    Thank you!

    • I suppose you mean the recorded avatar. Anyway, I think the root motion is saved into the recorded animation, so when you retarget the animation to a humanoid model in your real scene, and enable the ‘Apply root motion’-setting of its Animator-component, the squat should look OK. This may be seen in the animation preview window as well, when you select the recorded model in Assets/Models-folder.

      • Thank you for your suggestion. When I retargeted the animation to another avatar, the squatting motion looked more natural. One more question: I enabled the KinectRecorderPlayer script attached to the KinectController object. I want to check the data in the file (BodyRecording.txt), however I can’t find the file in the assets. Is this txt file generated when I finish the recording? Where can I find the file? Is the data in the file the joint position data?

        Thank you!

  9. Dear Rumen

    I love your asset and it works very well.
    This time I am trying to get the quaternion of each joint and display them live on the game scene.
    So I am wondering if I could get the data from the kinect and use that data to check whether the user is doing the right motion.

    Will it be possible and will there be a way to do this?

    Thank you!

    • Calling KinectManager.Instance.GetJointOrientation() should do the job. The last parameter ‘flip’ is the opposite of ‘mirrored movement’, i.e. it should be ‘true’ for a normally facing avatar and ‘false’ for a mirrored one.

      • long userId = KinectManager.Instance.GetPrimaryUserID();
        /// User ID
        /// Joint index
        /// If set to true, this means non-mirrored rotation
        KinectManager.Instance.GetJointOrientation(userId, (int)KinectInterop.JointType.Head);

        Is this the right way?
        I keep getting compile errors when I play the scene.

        Thank you.

      • As far as I can see, the last parameter of the GetJointOrientation()-method invocation is missing in the code above. Maybe this causes the compile errors. I think the last parameter (flip) should be ‘true’ in your case.
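For illustration, a corrected call might look like this (a sketch based on the K2-asset API as referenced in this thread; GetPrimaryUserID() is assumed to return the ID of the currently tracked user):

```csharp
// Sketch: query the head orientation of the primary user.
long userId = KinectManager.Instance.GetPrimaryUserID();
// last parameter 'flip' = true for a normally facing (non-mirrored) avatar
Quaternion headRot = KinectManager.Instance.GetJointOrientation(
    userId, (int)KinectInterop.JointType.Head, true);
Debug.Log("Head rotation: " + headRot.eulerAngles);
```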

  10. This was our original code that we were going to use

    using UnityEngine;
    using System.Collections;
    using UnityEngine.UI;

    public class CheckBodyAngle : MonoBehaviour
    {
        public GameObject objJoint;
        string strState = "Ready";
        string strResult = "Check";
        public Text textState;
        public Text textRotate;

        void Start()
        {
        }

        void OnGUI()
        {
            GUI.Label(new Rect(0, 150, Screen.width, 50), strState);
            GUI.Label(new Rect(0, 200, Screen.width, 50), strResult);
        }

        void Update()
        {
            if (objJoint != null)
            {
                strState = objJoint.transform.localRotation.ToString(); // + "/" +
                    // objJoint.transform.rotation.ToString() + "/" +
                    // objJoint.transform.localEulerAngles.ToString() + "/" +
                    // objJoint.transform.eulerAngles.ToString();
                textRotate.text = strState;

                if (objJoint.transform.localRotation.x > 0.6f && objJoint.transform.localRotation.x < 0.7f)
                    strResult = "Danger";
                else
                    strResult = "Check";
                textState.text = strResult;
            }
        }
    }


    But in your asset, I can’t find the avatar’s joint to drag into the Inspector window.

    Is there any way to solve this?

    Thank you.

    • Do you mean the MoCap animator or a scene from the K2-asset? In the MoCap animator, the LiveReplay-avatar is instantiated at runtime, so you should assign the objJoint-reference at runtime, too.
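A sketch of assigning the joint reference at runtime (the object name “LiveReplay(Clone)” and the bone path are assumptions that depend on your scene and model rig):

```csharp
void Update()
{
    if (objJoint == null)
    {
        // the avatar only exists after the scene instantiates it
        GameObject avatar = GameObject.Find("LiveReplay(Clone)");
        if (avatar != null)
        {
            // hypothetical bone path – adjust to your model's hierarchy
            Transform joint = avatar.transform.Find("Hips/Spine/Chest/Neck/Head");
            if (joint != null)
                objJoint = joint.gameObject;
        }
    }
}
```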

  11. Dear Rumen,

    After I enabled the KinectRecorderPlayer script attached to the KinectController game object, I can’t find the BodyRecording.txt file. I want to check the data in the file. How can I get the file? Could you give me some hints?


    • Sorry, I missed answering this question before. I personally use the KinectRecorderPlayer-component to test animation recording. The input text file should be present in the project folder. By default it is called ‘BodyRecording.txt’, and contains the body movements of all available users. This file is normally saved by the KinectRecorderDemo-scene of the K2-asset. But you can also use the KinectRecorderPlayer-component in the MoCap animator to save one. To do it, create a script with a reference to the KinectRecorderPlayer-component and call its StartRecording()-method to start file recording, and StopRecordingOrPlaying() to stop it. Then you will have a file to replay in the MoCap animator.
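As a sketch, such a script could look like this (StartRecording() and StopRecordingOrPlaying() are the KinectRecorderPlayer methods named above; the key bindings are arbitrary):

```csharp
using UnityEngine;

// Sketch: drives the KinectRecorderPlayer-component to save a body recording.
public class BodyRecordingControl : MonoBehaviour
{
    public KinectRecorderPlayer recorderPlayer;  // assign in the Inspector

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R))
            recorderPlayer.StartRecording();          // starts writing BodyRecording.txt
        if (Input.GetKeyDown(KeyCode.S))
            recorderPlayer.StopRecordingOrPlaying();  // stops recording (or playback)
    }
}
```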

  12. Thanks to your help, I solved the problem.
    Now I am wondering if I could set a range for the rotation angle of each joint, so that if the user gets into the range, the word “Check” will appear.

    Thank you.

  13. Dear Rumen,

    After I got the recorded motion using the MoCap asset, I wanted to edit the animation. In Unity 5, I found the animation under Assets/Models. Actually, I DON’T want the animation in looping mode, so I unchecked “Loop Time” and then clicked “Apply”. However, after I clicked the “Apply” button, “Loop Time” was automatically checked again. So in my scene the avatar still plays the animation in loop mode. It seems that I can’t uncheck the “Loop Time” setting. Could you tell me how to cancel the loop setting?

    Thanks a lot!

    • Yes, this is so by design. The looping animation in the scene is only there to demonstrate what has been recorded. It is not always possible to see the animation in full, if it plays only once.
      Actually, you don’t need this demo animation. Instead, copy the model with the recorded animation to your project, and then retarget the animation to the humanoid model in your scene, following the retargeting steps described above.

  14. Dear Rumen,

    I utilized the MoCap asset to record the motion and the position data at the same time. I found the avatar’s motion is much slower than my real motion, especially when the real motion is a little bit fast. And when I checked the data recorded at the same time, only 6 frames of data were recorded per second. I guess this low frequency caused the huge delay.

    But compared to the Avatar demo in your K2-asset, where the avatar can follow the real motion well, there is no obvious delay. How can I deal with the delay and increase the data per second in the MoCap asset? Six frames of data per second is too little.

    Thank you for your help

    • Hi, please set both the ‘Joint distance threshold’ and ‘Joint angle threshold’-settings of the KinectFbxRecorder (a component of the KinectController game object) to 0, and then try again.

  15. Hello Everyone:

    Is it possible in Maya or Modo v10.4 to save the animations in the BVH file format?

    Thank you,


  16. Hello Rumen,

    is it possible to get your animator for free, to try some things out with mocap? I just want to get a few short shots for my project (around 3) and don’t want to spend money on them.

    Thank you in advance

    Yours Faithfully

    • Hello Simon, are you a student, lecturer or researcher? If you are, you are eligible to get any of the packages here free of charge for academic use. Just send me your request from your university e-mail address, or provide some other proof that you are really studying there.

  17. Dear Rumen,

    when I use the MoCap, I found that if I don’t check “Grounded Feet” in the “Kinect Fbx Recorder” script, the squat motion is very strange. I mean, from standing to squat, the feet go into the ground; and from squat to standing, the feet are above the ground. Actually, my feet didn’t move.

    I also checked the recording data, and the position data of the feet seems fine. I know “Grounded Feet” will make the feet stick to the ground. So my question is: when converting the Kinect data to an FBX animation file, why is the feet motion so strange when doing a squat?

    Thank you!

    • It is like this because, by default, the AvatarController-component (responsible for avatar movements) applies forward kinematics. It starts from the position of the waist and then applies the rotations of the different joints down the hierarchy. The feet are last. So, depending on the accuracy of the user tracking and your own height and bone lengths, it is possible for the model’s feet to go into or over the ground. That’s why there is the ‘Grounded feet’ setting, which keeps the model’s feet on the ground. But keep in mind that even with the setting turned on, the feet may briefly go into or over the ground. This is because there is a time threshold of 0.2 seconds during which the feet are allowed not to be on the ground (for instance, when the user jumps). If you want to lower or clear this threshold, open KinectScripts/AvatarController.cs and look for the definition of the ‘maxFootDistanceTime’-constant.
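For orientation, the constant described above might look like this in AvatarController.cs (the exact declaration and modifier may differ between versions of the asset):

```csharp
// Time (in seconds) the feet are allowed to be off the ground,
// e.g. while the user jumps. Set to 0f to ground the feet immediately.
protected float maxFootDistanceTime = 0.2f;
```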

  18. Hello

    I bought the Mocap and it works well. I am very interested in giving different physical properties to my avatars, just as the video shows. Does anyone know a tutorial for that? I really want to get some of the same effects as in the video.

    • The MoCap animator just creates animation clips you can reuse in your other projects (like the running in this artistic clip). Instead of looking for tutorials, why don’t you contact the author directly and ask him for details and advice? Here is his YouTube-channel. His name is Ricardo, and he is a very creative and supportive person.

  19. Dear Rumen,

    I really feel sorry about asking questions about this Unity package now, because I know you will not support this package anymore. But I have to ask, because it’s very important for me. For my purposes, I have to replay the motion of the Kinect-controlled avatar in the same scene, rather than retargeting the motion in another project, so I made my project based on your demo scene. But I find that all the motions (both Kinect-captured animations and external animations) can’t be edited in that scene. Animations can’t be split, and I also can’t uncheck “Loop Time”. But if I create a new Unity project, for the same animation the editing succeeds. Could you give me some hints, so that I can do the animation editing (split and uncheck the loop)? Thank you for your help.

    • There is a script in the KinectMocapFbx/Scripts-folder called ‘ModelPostProcessor.cs’. It prevents updates to the animation import settings in the same project. I’ve put it there to prevent some issues when overwriting the same animations. If you want to modify the animation parameters, delete this script. But be aware that there may be issues when overwriting the same animation multiple times, so you should provide another workaround in that case.

  20. Pingback: Kinect v2 Examples with MS-SDK and Nuitrack SDK | Technology, Health and More

  21. Hello, I’m French, I’m a teacher. I tried to use your mocap recorder scene with Unity, but the right leg doesn’t record and stays bent. Do you know what the problem could be?

    • Hi, please open ‘AvatarController.cs’ in ‘K2Examples/KinectScripts’-folder and modify line 351 like this:

      if (!boneIndex2MecanimMap.ContainsKey(boneIndex) /** || boneIndex >= 21 */)

  22. Hello there, I tried to record using the RecorderDemo-scene from the K2-asset. After saying “record”, it recorded, but after I stopped the scene, I couldn’t find the recorded FBX animation file.

      • You can’t. In the MocapAnimatorDemo-scene, though, you can see how to utilize the .anim-file to animate another humanoid model. If you need the animation as an FBX file, please look at the ‘Kinect MoCap Animator’-package. Please e-mail me if you want to get it from me. To utilize it, you would need a recording from either the K2- or K4A-asset.
