Kinect with MS-SDK

‘Kinect with MS-SDK’ is a set of Kinect v1 examples that share several major scripts, grouped in one folder. It demonstrates how to use Kinect-controlled avatars, Kinect-detected gestures and other Kinect-related features in your own Unity projects. This asset uses the Kinect SDK/Runtime provided by Microsoft. For more Kinect v1-related examples, utilizing Kinect Interaction, Kinect Speech Recognition, Face Tracking or Background Removal, see the KinectExtras with MsSDK. These two packages work with Kinect v1 only and can be used with both Unity Pro and Unity Free editors.

Known Issues:
Unity 5.0 and later introduced an issue that, in many cases, causes Kinect tracking to stop after 1-2 minutes with a DeviceNotGenuine error. As a result, the Kinect-enabled game appears to freeze. There is no definitive workaround yet. My advice is: if you encounter this issue, install Unity 4.6 alongside Unity 5 (just in another folder) and use the K1-assets in the Unity 4 environment.

Update 12.Aug.2015: The bug-report, along with all details, is in the hands of Unity staff now, for reproduction and fix.
Update 07.Sep.2015: Here is what I got from the Unity support staff: “We have checked that other people have the issue outside of Unity also; the entire device goes into a broken state. That indicates a driver bug. If we pop up the Windows device manager, we can see the device is broken after it gets into this state, and even unplugging it and plugging it back in doesn’t fix it, so it’s an issue in Microsoft’s driver.”
Update 10.Sep.2015: One other asset user – Thank you Victor Grigoryev! – has experimented with multiple configurations. Here are his conclusions:

  • If I connect the Kinect to USB 3.0 on Windows 8 or 10, it works perfectly. The issue occurred on my Windows 7 machine, which has USB 2.0 only (no USB 3.0 on the motherboard at all).
  • When I connected to USB 2.0 on Windows 8 or 10, I saw some lag from the Kinect, but it still worked.
  • On Win 7, I reinstalled the Kinect SDK and then RESTARTED the computer. Without restarting, nothing changes – I checked that several times. Not restarting was one of my major mistakes before.

Update 29.Oct.2015: I posted the issue on the MSDN Kinect-v1 forum here. Please up-vote the post and comment on it with your own case and machine configuration (CPU, OS, USB port, sensor, Kinect SDK, Unity version, etc.), if you suffer from this issue. This is the last thing we can do to ask the Kinect team to provide a workaround or an SDK update.

Last Update, FYI: The best workaround for this issue, as reported so far by different users, is as follows: uninstall all Kinect-v1 related packages (SDKs, drivers, OpenNI, NiTE, Zigfu, etc.), restart, install Kinect SDK 1.8 only, and restart again.

How to Run the Example:
1. Install the Kinect SDK 1.8 or Runtime 1.8 as explained in Readme-Kinect-MsSdk.pdf, located in Assets-folder.
2. Download and import this package.
3. Open and run scene KinectAvatarsDemo, located in Assets/AvatarsDemo-folder.
4. Open and run scene KinectGesturesDemo, located in Assets/GesturesDemo-folder.
5. Open and run scene KinectOverlayDemo, located in Assets/OverlayDemo-folder.
6. Open and run scene DepthColliderDemo, located in Assets/DepthColliderDemo-folder.

Download:
The official release of ‘Kinect with MS-SDK’-package is available in the Unity Asset Store.
The project’s Git-repository is located here. The repository is private, and access is limited to contributors and donators only.

Troubleshooting:
* If you need integration with the KinectExtras, see ‘How to Integrate KinectExtras with the KinectManager’-section here.
* If you get DllNotFoundException, make sure you have installed the Kinect SDK 1.8 or Kinect Runtime 1.8.
* Kinect SDK 1.8 and tools (Windows-only) can be found here.
* The example was tested with Kinect SDK 1.5, 1.6, 1.7 and 1.8.
* Here is a link to the project’s Unity forum: http://forum.unity3d.com/threads/218033-Kinect-with-MS-SDK

What’s New in Version 1.12:
1. Updated AvatarController to use the Mecanim configured bones. Big thanks to Mikhail Korchun!
2. Added AvatarControllerClassic-component to allow manual assignment of bone transforms. Big thanks to Aaron Brooker!
3. Added ‘Offset relative to sensor’-setting to AvatarController and AvatarControllerClassic, to provide the option to put the avatar at its real Kinect coordinates. Big thanks to Claudio Rufa!
4. Added depth-collider demo scene, to demonstrate the mapping of Kinect space and depth coordinates to Unity world coordinates, and how this can be used for VR collisions.
5. Added gestures debug-text-setting to KinectManager to enable easier gesture development.
6. Updated detection of the available gestures, to make them more robust and easier to use.
7. Fixed sensor initialization, when the speech manager from KinectExtras is integrated.

Playmaker Actions for ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’:
And here is “one more thing”: A great Unity-package for designers and developers using Playmaker, created by my good friend Jonathan O’Duffy from HitLab-Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect v1 and a lot of example scenes. The package integrates seamlessly with ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’-assets. I can only recommend it!

163 thoughts on “Kinect with MS-SDK”

  1. Hi Rumen,

    thanks for the great free tools! 🙂 a couple of questions..

    I see the green skeleton / cube-man joint positions, but the hip position and feet seem quite different on the 3d avatar. When I lift my arms up and down, the 3d avatar moves up and down. Also, when I lean left/right, the character’s feet move in the opposite direction; it appears to be floating rather than planted, as the cube man is..

    could you tell me if this is as good as it gets or if there is a way to adjust this?

    I got it working on my own avatar and the issues are still there..

    if not are these issues resolved in the kinect v2 version? (is it worth me upgrading for that reason?)

    thanks and kind regards!

    Alex

    • Hi Alex, there are some differences between the cube-man and the 3d avatar. While the cube-man shows the positions of the Kinect joints, the avatar uses only the body central joint (hip-center) and the joint orientations. Probably the hip-center position changes a bit, when you lift your hands, and this leads to avatar movements up and down. Also, the bones of the avatar are oriented in space by using forward kinematics (i.e. from hip-center to the limbs, not vice versa), which causes these sometimes incorrect limb positions. If you need to have the feet grounded all the time, maybe a script (or package) for inverse kinematics (IK) would provide some help. In the K2-package the algorithm is similar, but the K2-sensor is more precise than K1. That’s the difference. I think it’s not worth upgrading just because of this avatar issue, if you don’t possess a Kinect v2 sensor.

  2. hi rumen, thanks for the reply.. maybe the hip shifts a little, but as it is, it seems quite exaggerated, more so when lifting both arms. I see there’s a vertical position toggle – do you think at some point you could add a scaling function, so you can go from no vertical movement to 100%? That would allow a bit more tuning.. regarding the IK, will it be possible to add bone control on top of the avatar controller? Also, I found I needed to use the AvatarControllerClassic and manually map the bones when using an Autodesk Character Generator character, or it was pinned to the spot when running the game.. is this the correct approach? sorry for all the ?? I am a Unity noob, but have got some fast results, so I am interested to take it further..

  3. hi rumen , i want to develop a virtual dressing room with your sdk . please tell me if it is possible and if yes how to achieve it

  4. Hi Rumen, I would like to know if there is any way to overlay an object not only moving it with the user, but also rotating it acordingly. Say, for a basic example, having a cube over a persons head and have it rotate as he rotates the head. How can I go about it? Any way to get the rotation data necessary for it?

    Thanks.

    • The 2nd face-tracking demo in the Kinect v2-asset does what you describe (it works with Kinect v1, too). To do it with the K1-asset, you need to get the joint rotation of the neck (the function is KinectManager.Instance.GetJointRotation(), as far as I remember), and then apply that rotation to the cube.
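      A rough sketch of what this reply describes, assuming the K1-asset’s API – the method name and joint index below are taken from the reply and from memory, so check KinectManager.cs in your asset version (the method may be called GetJointOrientation in some releases):

      ```csharp
      using UnityEngine;

      // Hypothetical sketch: rotate an overlay object with the user's neck joint.
      // KinectManager and KinectWrapper come from the 'Kinect with MS-SDK' asset.
      public class HeadOverlayRotator : MonoBehaviour
      {
          public Transform overlayObject;  // e.g. the cube overlaid on the user's head

          void Update()
          {
              KinectManager manager = KinectManager.Instance;
              if (manager == null || !manager.IsUserDetected())
                  return;

              uint userId = manager.GetPlayer1ID();

              // rotation of the shoulder-center (neck) joint; the last flag mirrors
              // the rotation - toggle it if the movement looks inverted on screen
              Quaternion neckRot = manager.GetJointRotation(userId,
                  (int)KinectWrapper.NuiSkeletonPositionIndex.ShoulderCenter, false);

              overlayObject.rotation = neckRot;
          }
      }
      ```

      Attach the component to any scene object, assign the cube to overlayObject, and it will follow the head rotation each frame.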

  5. Hello! I’m making a virtual dressing room project using Kinect v1, and I’m using the overlay demo from your asset. The question is: how could I match the clothes to the person standing in front of the screen? Thanks in advance! :)

  6. Hi Rumen,
    Thanks so much for the great tools.
    I’m trying to make a game in which the user can move the player character forward by swinging their arms.

    If this is too complicated, I’d like to make the character move forward when the player raises their arms.
    Let me know if you know how to figure this out.
    Thanks,
    Sam

    • Hi. If this is the way you want to control the player, then you need gestures and a gesture listener. In the gesture listener (see the GestureListener-component in KinectGesturesDemo as example), you could move your character forward, when a discrete gesture (like raising arm) is completed, or while continuous gesture (like wheel) is in progress. You can also define your own gesture for detecting swinging arms, if you need it.
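      To illustrate the reply above, here is a minimal sketch of such a gesture listener, assuming the K1-asset’s GestureListenerInterface – the interface, enum and callback signatures below are reconstructed from memory and may differ slightly in your asset version, so compare with the GestureListener-component in KinectGesturesDemo:

      ```csharp
      using UnityEngine;

      // Hypothetical sketch: move a character forward each time a discrete
      // gesture (raising the right hand) is completed.
      public class MoveForwardListener : MonoBehaviour, KinectGestures.GestureListenerInterface
      {
          public Transform character;       // the player character to move
          public float stepDistance = 0.5f; // meters to advance per completed gesture

          public void UserDetected(uint userId, int userIndex)
          {
              // register the gestures to be tracked for this user
              KinectManager.Instance.DetectGesture(userId, KinectGestures.Gestures.RaiseRightHand);
          }

          public void UserLost(uint userId, int userIndex) { }

          public void GestureInProgress(uint userId, int userIndex, KinectGestures.Gestures gesture,
              float progress, KinectWrapper.NuiSkeletonPositionIndex joint, Vector3 screenPos)
          {
              // a continuous gesture (like Wheel) could drive movement here instead
          }

          public bool GestureCompleted(uint userId, int userIndex, KinectGestures.Gestures gesture,
              KinectWrapper.NuiSkeletonPositionIndex joint, Vector3 screenPos)
          {
              if (gesture == KinectGestures.Gestures.RaiseRightHand)
                  character.Translate(Vector3.forward * stepDistance);

              return true;  // returning true re-arms detection of the same gesture
          }

          public bool GestureCancelled(uint userId, int userIndex, KinectGestures.Gestures gesture,
              KinectWrapper.NuiSkeletonPositionIndex joint)
          {
              return true;
          }
      }
      ```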

  7. Can I request the “KinectExtras with MsSDK”-package? It is no longer available in the Unity Asset Store. I want very much to learn about Kinect, but didn’t know where to start until I found your package. I also want to learn something from the Playmaker actions, but they require that missing package T.T

  8. Hi, I want to use two devices working together in one scene. I saw the useMultiSourceReader parameter, but I don’t know how to use it to access the data streams of two sensors. What is the reference? Thank you.

    • Hi, I suppose you mean the K2-asset. No, you cannot use more than one Kinect, moreover only one sensor can be connected to a single PC/notebook. The ‘Use multi-source reader’-setting concerns stream source synchronization (i.e. color, depth, coordinate mapping), which is important when one needs color camera overlays or background removal.

  9. Hi Rumen
    I have a problem with this asset: my Kinect doesn’t start tracking, and the KinectAvatarsDemo scene only shows ‘Waiting for users’. Any idea what the problem is, or could you help me, please?

    • Restart the computer and check if the Kinect is powered up and connected to the PC. Then run the ‘Developer toolkit browser’ (part of Kinect SDK 1.8) and check whether several of the samples there work (at least the color, depth & skeleton demos). If they work, try the avatars demo in Unity again. Otherwise, look for a possible hardware/SDK issue.

      • I tried to debug, but it does not go into the Squat case in KinectGestures, and this code should be correct, because it came with the project.

      • In this case, are you sure the Squat-gesture is configured for detection? For instance, as far as I remember, there is a component in the avatars-demo called SimpleGestureListener (or GestureListener in gestures-demo). In the UserDetected()-method of this script there are calls to KinectManager.DetectGesture()-method to define the gestures tracked in this scene. The Squat-gesture should be there, too.
