Kinect Mocap Animator is a simple motion-capturing tool that records user motions into fbx animation clips, when run in the Unity editor. It provides optional hand and finger tracking, if a LeapMotion sensor is also available. The recorded animations may later be retargeted to other humanoid models in your own Unity projects. Apart from the Kinect-v2 and v1 sensors, this package supports the Intel RealSense D400-series, as well as Orbbec Astra & Astra-Pro sensors, via the Nuitrack body tracking SDK. Please mind that the Nuitrack SDK is still at an early stage, and issues are possible.
‘Kinect MoCap Animator’ works with Kinect-v2 & Kinect-v1 (aka Kinect for Xbox One & Kinect for Xbox 360), Intel RealSense D415 & D435, Orbbec Astra & Astra-Pro, and other Nuitrack-supported sensors as well. It can utilize LeapMotion finger tracking, if finger animation is needed. The package can be used in all versions of Unity – Free, Plus & Pro.
Free for personal use to the K2-asset customers (as of 17.Aug.2018):
As of 17.Aug.2018, I provide the ‘Kinect Mocap Animator’-package free of charge for personal use to anyone who requests it by e-mail and has a valid invoice for the K2-asset. Please mind that the free packages do not include e-mail support.
How to run ‘Kinect Mocap Animator’:
1. (Kinect-v2 & v1) Download and install the Kinect v2 or v1 SDK. The download links are below.
2. (Kinect-v2) If you want to use Kinect speech recognition, download and install the Speech Platform Runtime, as well as EN-US plus any other needed language packs. The download links are below.
3. (Nuitrack or Orbbec) If you want to work with Nuitrack body tracking SDK, look at this tip. If you want to work with Orbbec Astra or Astra-Pro sensors via OpenNI2, look at this tip.
4. Import this package into a new Unity project.
5. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
6. Make sure that Direct3D11 is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
7. Open and run the Assets/KinectMocapAnimator-scene in Unity editor.
More information about the Mocap Animator can be found in the Online documentation, as well as in the ‘Readme-Kinect-Mocap-Animator.pdf’-file in the package.
* Kinect for Windows SDK 2.0 (Windows-only) can be downloaded here.
* MS Speech Platform Runtime v11 (both 32- and 64-bit versions) can be downloaded here.
* Kinect for Windows SDK 2.0 language packs can be downloaded here.
* LeapMotion Orion Setup can be downloaded here.
* If you use Kinect-v1 with Kinect SDK 1.8, but have also installed the Kinect SDK 2.0, please look at this tip.
How to use the LeapMotion-sensor for finger tracking:
You can use the LeapMotion-sensor to track the user’s fingers and record the full finger motion in the animation. To do it, follow the steps here.
How to utilize the recorded animations:
Please mind that you don’t need to replace the models in the Mocap-Animator with your own, in order to utilize the recorded animation(s). Instead, use the retargeting capabilities of Mecanim, the Unity animation system, like this:
1. Copy the model with the recorded animation (usually Assets/Models/Recorded.fbx) to the Unity project, where you plan to utilize the animation.
2. Select the copied model in Unity editor, switch to the Rig-tab of its Import settings, and make it Humanoid.
3. Go to the Animations-tab of the Import settings now, and make the needed changes, like setting the Start & End frames, Loop time, etc. Apply the changes.
4. Open the scene where your model should play the animation, and select it in Hierarchy. Its rig in Import settings should also be Humanoid.
5. Make sure the model has an Animator-component, and there is an animator controller assigned to it. If not, create a new animator controller in assets, and assign it to the Controller-setting of the Animator-component.
6. Open the Animator-window (menu Window / Animator).
7. Drag the recorded animation (or the entire model with the animation, from step 1) and drop it into the Animator window. The recorded animation should appear in the Animator’s state machine. That’s it.
8. Run the scene and check whether your model plays the recorded animation correctly. This way you can retarget the recorded animation(s) to any other humanoid model in your project, without physically recording the animation clips into each of them.
* The online documentation of the ‘Kinect Mocap Animator’-package is available here.
* The ‘Kinect Mocap Animator’-package is available on request, if you have a valid invoice for the K2-asset.
Output File Formats:
To get the output file in the format you need, change the ‘Output file format’-setting of the KinectFbxRecorder (a component of the KinectController game object in the scene). Don’t forget to change the extension of the saved file in the ‘Save file path’-setting, if needed. Keep in mind that in this case you may not see the mo-cap results properly in the scene. Here are the available output file formats:
- FBX binary – saves fbx-file in binary format;
- FBX ascii – saves fbx-file in text format;
- FBX 6.0 binary – saves fbx-file v6.0 in binary format. This is the default setting;
- FBX 6.0 ascii – saves fbx-file v6.0 in text format;
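If you want to verify which of the formats above the recorder actually wrote, the file header tells you: binary fbx-files start with a fixed magic string, while ascii fbx-files are plain text. Here is a minimal Python sketch (run outside Unity) that distinguishes the two; the magic string and the version offset follow the publicly documented binary FBX layout, and `inspect_fbx` is just a hypothetical helper name:

```python
import struct

# Binary FBX files begin with the 21-byte magic "Kaydara FBX Binary  "
# (two spaces) followed by a NUL byte; the format version is stored as a
# little-endian uint32 at byte offset 23 (e.g. 7400 for FBX 7.4).
# ASCII FBX files are plain text and lack this magic.
FBX_BINARY_MAGIC = b"Kaydara FBX Binary  \x00"

def inspect_fbx(path):
    """Return ('binary', version) or ('ascii', None) for an .fbx file."""
    with open(path, "rb") as f:
        header = f.read(27)  # magic (21) + 2 reserved bytes + version (4)
    if header.startswith(FBX_BINARY_MAGIC):
        (version,) = struct.unpack_from("<I", header, 23)
        return "binary", version
    return "ascii", None
```

For example, `inspect_fbx("Assets/Models/Recorded.fbx")` on a file saved with one of the binary settings should report `'binary'` together with the version number, while the ascii settings produce a file it reports as `'ascii'`.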
What’s new in version 1.7:
1. Revived the Mocap-animator package, with support for multiple motion-capturing sensors – Kinect v2 & v1, RealSense D415 & D435, Astra & Astra-Pro.
2. Updated the package to use the latest KinectScripts – v2.17.3.
3. Updated the package to require Unity editor v2017.1.0f3 or newer.