Kinect MoCap Animator is a simple-to-use Kinect mo-cap tool that records user motions as an fbx animation clip. It works with Kinect-v2 or Kinect-v1 (aka ‘Kinect for Xbox One’ and ‘Kinect for Xbox 360’), and optionally with a LeapMotion sensor, if finger animation is needed as well. You can run the tool directly in the Unity editor, to record animations for the humanoid models in your own projects and scenes. It saves the recorded animation in an fbx-file of your choice. The recorded animation may be retargeted to other humanoid models in your Unity projects and scenes, or processed further in an external 3d modeling tool, like Maya or 3dsMax. The tool can be used in both Unity Pro and Unity Personal editors.
The MoCap Animator got a very bad review on the Unity Asset Store from the so-called “expert” Ali Zanjiran (aka Ali Yeredon, aka MasterEdon), who rated the asset as buggy and unusable. After that I decided to take the MoCap Animator off the Asset Store, as of 10.Nov.2017. It is no longer available to anyone, whether for academic, private or commercial use. Free support for existing customers will remain valid until the end of 2017. As a possible replacement, prepare some money and look at ‘iPi Mocap Studio’, suggested by the same “expert”. In my opinion iPi is OK, but it is a quite expensive, offline-only mo-cap system without finger tracking.
Free for education (N/A as of 10.Nov.2017):
The package is free for academic use (i.e. in schools, colleges and universities, by students, teachers or researchers). If you meet these criteria, please e-mail me to get the K2-MoCap animator directly from me.
My only request is NOT to share the package or its scenes in source form with others, without my explicit consent, regardless of whether you purchased it at the Unity Asset Store, or got it from me free of charge for academic use. Please respect my work.
As of 10.Nov.2017: Sharing the asset online or offline, or publishing it in a public repository, separately or as part of another project, will be considered a license infringement.
How to run ‘Kinect MoCap Animator’:
1. Install the Kinect for Windows SDK v2.0 or Kinect Runtime. The download link is below.
2. Download and install the Speech Platform Runtime v11 (both x86 and x64 versions). The download link is below.
3. Download and install the en-US Kinect language pack. The download link is below.
4. If finger tracking is needed, and you possess a LeapMotion-sensor, please download, unzip and install the LeapMotion Orion Setup. The download link is below.
5. Import ‘Kinect Mocap Animator’ into a new Unity project.
6. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
7. Open and run the Assets/KinectMocapAnimator-scene.
More information can be found in the Online documentation, as well as in the Readme pdf-file in the package.
* Kinect for Windows SDK 2.0 (Windows-only) can be downloaded here.
* Please mind that Kinect-v2 requires Windows 8 or later, as well as a USB-3 port on your machine.
* MS Speech Platform Runtime v11 (both 32- and 64-bit versions) can be downloaded here.
* Kinect for Windows SDK 2.0 language packs can be downloaded here.
* LeapMotion Orion Setup can be downloaded here.
* If you use Kinect-v1 with Kinect SDK 1.8, but have also installed the Kinect SDK 2.0, please look at this tip.
How to use the LeapMotion-sensor for finger tracking:
You can use the LeapMotion-sensor to track the user’s fingers and record the full finger motion in the animation. To do it, follow the steps here.
How to utilize the recorded animations:
Please mind that you don’t need to replace the models in the MoCap Animator with your own, just to utilize the recorded animation(s). Instead, use the retargeting capabilities of Unity’s animation system Mecanim, like this:
1. Copy the model with the recorded animation (usually Assets/Models/Recorded.fbx) to the Unity project, where you plan to utilize the animation.
2. Select the copied model in Unity editor, switch to the Rig-tab of its Import settings, and make it Humanoid.
3. Go to the Animations-tab of the Import settings now, and make the needed changes, like setting the Start & End frames, Loop time, etc. Apply the changes.
4. Open the scene where your model should play the animation, and select it in Hierarchy. Its rig in Import settings should also be Humanoid.
5. Make sure the model has an Animator-component, and there is an animator controller assigned to it. If not, create a new animator controller in the assets, and assign it to the Controller-setting of the Animator-component.
6. Open the Animator-window (menu Window / Animator).
7. Drag the recorded animation (or the entire model with the animation from step 1) and drop it into the Animator window. The recorded animation should appear in the Animator’s state machine. That’s it.
8. Run the scene and check whether your model plays the recorded animation correctly. This way you can retarget the recorded animation(s) to all other humanoid models in your project, without physically recording the animation clips into them.
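If you prefer to start the retargeted clip from a script instead of relying on the default state of the controller, a minimal Unity C# sketch could look like this. The state name “Recorded” is a hypothetical example (it is the name of the state created when you drop the recorded clip into the Animator window); this snippet only runs inside a Unity scene, attached to the model set up in the steps above:

```csharp
using UnityEngine;

// Plays the retargeted mo-cap clip on a humanoid model.
// Assumes the model has an Animator component whose controller
// contains a state named "Recorded" (hypothetical example name).
public class PlayRecordedClip : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();

        // Cross-fade into the recorded animation state over 0.25 seconds.
        animator.CrossFade("Recorded", 0.25f);
    }
}
```

Attach the script to the model’s game object in the scene; on play, the Animator transitions into the recorded clip.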
* The online documentation of the ‘Kinect Mocap Animator’-package is available here.
* The official release of ‘Kinect Mocap Animator’-package is available at the Unity Asset Store. All updates are free of charge.
Output File Formats:
To get the output file in the format you need, change the ‘Output file format’-setting of the KinectFbxRecorder (a component of the KinectController game object in the scene). Don’t forget to change the extension of the saved file in the ‘Save file path’-setting, if needed. Keep in mind that in this case you may not see the mo-cap results properly in the scene. Here are the available output file formats:
- FBX binary – saves the fbx-file in binary format;
- FBX ascii – saves the fbx-file in text format;
- FBX 6.0 binary – saves the fbx-file v6.0 in binary format. This is the default setting;
- FBX 6.0 ascii – saves the fbx-file v6.0 in text format.
What’s new in version 1.5:
1. Added ‘Display Leap Motion camera’-setting, to turn the LeapMotion camera background on or off.
2. Fixed finger tracking with the LeapMotion sensor, when the user turns left or right.
3. Updated UI canvas, to make it visible in VR mode. Distance from the camera is adjustable.
4. Changed the default values of the ‘Joint distance threshold’ and ‘Joint angle threshold’-settings to 0.
5. Updated the package to utilize the latest KinectScripts – v2.14.