Kinect v2 Examples with MS-SDK

‘Kinect v2 Examples with MS-SDK’ is a set of Kinect-v2 (aka ‘Kinect for Xbox One’) examples that use several major scripts, grouped in one folder. The package contains over thirty demo scenes. The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demos show how to use programmatic or VGB gestures, the fitting-room demos show how to create your own Kinect dressing room or character overlays, etc. You can find short descriptions of all demo scenes in the K2-asset online documentation. The package works with both Kinect-v2 and Kinect-v1 (aka Kinect for Xbox One & Kinect for Xbox 360), supports Windows 32- and 64-bit builds, and can be used in both Unity Pro and Unity Personal editors.

Free for education:
This package is free for academic use (i.e. in schools, colleges and universities, by students, teachers or researchers). If you match this criterion, please e-mail me to get the K2-asset directly from me.

One request:
My only request is NOT to share the package or its demo scenes in source form with others, without my explicit consent, regardless of whether you purchased it from the Unity asset store, or got it from me free of charge for academic use. Please respect my work.

Customer support:
* First, please check if you can find the answer you’re looking for on the Tips, tricks and examples page, as well as on K2-asset Online Documentation page. See also the comments below the articles here, or in the Unity forums. If it is not there, you may contact me, but please don’t do it on weekends or holidays. This is the only time I can have some rest.
* If you e-mail me, please include your invoice number. I provide free e-mail support for the first 3 months after the product purchase (limited to max 10 issues). More information regarding e-mail support can be found here.
* Please note that you can always update the K2-asset free of charge from the Unity Asset Store.

How to run the demo scenes:
1. Install the Kinect for Windows SDK v2.0 or Kinect Runtime. The download link is below.
2. If you want to use Kinect speech recognition, download and install the Speech Platform Runtime (both x86 and x64 versions), as well as the needed language pack. The download links are below.
3. Import this package into a new Unity project.
4. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
5. Make sure that ‘Direct3D11’ is the first option in the ‘Graphics API’-list, in ‘Player Settings / Other Settings / Rendering’.
6. Open and run a demo scene of your choice from a subfolder of the ‘Assets/KinectDemos’-folder. The short descriptions of all demo scenes are available here.
7. To build Windows store UWP-10 project, follow the steps in this tip.

* The official online documentation of the K2-asset is available here.

* The official release of the ‘Kinect v2 with MS-SDK’ is available at the Unity Asset Store.

* Kinect for Windows SDK 2.0 (Windows-only) can be downloaded here.
* MS Speech Platform Runtime v11 (32- and 64-bit versions) can be downloaded here. Please install both x86 and x64 versions, to be on the safe side.
* Kinect for Windows SDK 2.0 language packs can be downloaded here. The language codes are listed here.

* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, you need to set the build platform to ‘Windows standalone’. For more information look at this tip.
* If the Unity editor crashes, when you start demo-scenes with face-tracking components, look at this workaround tip.
* If the demo scene reports errors or remains in ‘Waiting for users’-state, make sure you have installed Kinect SDK 2.0, as well as the other needed components, and the sensor is connected.

* Here is a link to the project’s Unity forum:
* Many Kinect-related tips, tricks and examples are available here.
* These examples were developed with Kinect for Windows SDK v2.0 – dev preview 1404-1407, public preview 1409 and release 1410. To run the latest version of the K2-asset, please install the latest release of Kinect v2 SDK.

Known Issues:
* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’“, open the project’s Build Settings (menu ‘File / Build Settings’) and make sure that ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left. On the right side, ‘Windows’ must be the ‘Target platform’, and ‘x86’ or ‘x86_64’ the ‘Architecture’. If Windows Standalone was not the default platform, make these changes and then click the ‘Switch Platform’-button to set the new build platform. If the Windows Standalone platform is not installed at all, run UnityDownloadAssistant again, then select and install the ‘Windows Build Support’-component.
* If you experience Unity crashes when you start the Avatar demos, Face-tracking demos or Fitting-room demos, this is probably due to a known bug in the Kinect face-tracking subsystem. In this case, first try to update the NVidia drivers on your machine to their latest version from the NVidia website. If this doesn’t help and the Unity editor still crashes, please disable or remove the FacetrackingManager-component of the KinectController-game object. This provides a quick workaround for many demo scenes. The face-tracking component and demos will still not work, of course.
* Unity 5.1.0 and 5.1.1 introduced an issue that causes some of the shaders in the K2-asset to stop working (they worked fine in 5.0.0 and 5.0.1). The issue is most visible if you run the background-removal demo: in Unity 5.1.0 or 5.1.1 the scene doesn’t show any users over the background image. The workaround is to update to Unity 5.1.2 or later, where the shader issue was fixed.

What’s New in Version 2.13:
1. Added speech-recognition support to the Windows Universal-10 builds.
2. Added the KinectProjectorDemo scene, based on RoomAliveToolkit calibration, to demonstrate 3D overlays over real users from the projector’s point of view.
3. Added the ‘Apply muscle limits’-setting to the AvatarController-component, to limit the avatar’s bone rotations to the muscle limits set in the Unity avatar definition.
4. Added support for pausing apps in Windows Universal-10 builds.
5. Added the ‘Skip remote avatars’ and ‘Estimate joint velocities’-settings to the KinectManager (courtesy of Andrew/Andrzej W).
6. Added the ‘External head rotation’-setting to the AvatarController-component (courtesy of Andrzej W).
7. Added the ‘Raised right horizontal left hand’ and ‘Raised left horizontal right hand’-poses to the KinectGestures-component (courtesy of Andrzej W).
8. Added the UserHandVisualizer script component, to allow 3D visualization of the user’s hands.
9. Updated the FacetrackingManager-component, to optimize the HD face-model updates.
10. Fixed the KinectHolographicViewer scene’s behavior when the user moves forwards or backwards.

Videos worth more than 1000 words:
Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

..a video by Ranek Runthal, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

..and a video by Brandon Tay, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.0:

551 thoughts on “Kinect v2 Examples with MS-SDK”

  1. Thank you for the great asset. How can I put 3D objects in front of the background-removal image, so that the user could “hide” behind an object?

  2. Hi Rumen, is it possible to use the Playmaker actions with Kinect v2? I have a complex project on Kinect v1 that uses this asset, and I need to update it to run with Kinect v2. Thanks

    • Hi, there are some example PM actions in the K2-asset. You can find them in the archive located in the KinectScripts/Playmaker-folder. They may be unzipped when the Playmaker package is also imported. More actions could be added when needed; just use the current ones as examples. Please tell me if you need more information.

  3. Hi, Rumen.
    I need to get a picture with good quality, but the Kinect has a poor-quality camera, so I wonder whether it is possible to combine the Kinect sensor with a good camera. How can I calibrate the Kinect and the camera to make them work together?

    • Not sure, but I suppose you need to map the Kinect color camera image (1920 x 1080) to your quality camera image. By the way, do you mean 1080p is poor quality?

  4. Hi,
    can you please tell me how to control the cursor using WristLeft or WristRight instead of HandLeft and HandRight?

  5. Hello
    Your package has been a blessing; Kinect is not easy. Thanks for your work.
    I just want to share some of my creations with your package:

    Mixed with Leap Motion + cardboard for full body tracking

    3D hair style:



    8bit avatar:

    Attraction Aura:

  6. Hello Rumen, I just purchased the ‘Kinect v2 with MS-SDK’ examples from the Asset Store, but I have some questions about the KinectAvatarDemo2.unity scene in the AvatarsDemo-folder. When I run the scene, it always reports a runtime error. I don’t know how to deal with that; I run the example in Unity 5.3.0f4. Besides, I want to use the Kinect v2 examples with an avatar character in the Web Player. How can I do that? I found there is a KinectDataServer scene in the KinectDataServer-folder. How does it work?

    • Hi Leon, sorry for the delayed response. I don’t work at weekends and holidays. To your questions:

      The K2-asset can work in Windows standalone mode only, because it calls the Kinect SDK 2.0 API. Please open ‘File / Build Settings’ and make sure ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left and ‘Windows’ as target on the right.

      The KinectDataServer is the server-app for ‘Kinect v2 VR Examples’-asset (K2VR for short). K2VR is a set of lightweight Kinect-scenes, which get their data over the network instead. It is aimed to work on mobile platforms, such as Android and iOS, and VR-platforms, such as Oculus, GearVR or Vive. If you would like to try it out, please e-mail me your invoice number from Unity asset store and I’ll send it to you. I’ve not tested it on the WebGL-platform so far (i.e. issues are possible), but I’ll try to do it this week. I’m also not sure from security point of view, if a web app should have rights to get private information, such as the movement of human bodies in the room.

  7. Hi Rumen, first of all many thanks for your amazing plugin, it’s working really well!
    For my game I’d like to use two hand cursors (one for each hand), which can be moved separately and are fully functional on their own. My current approach is to duplicate the InteractionManager and restrict each copy to a specific hand. This way I managed to get two cursors, but they’re both not really functional and quite buggy. Do you have an idea for a better approach, without the use of two separate scripts?

    And what is it about the “lasso” state for the hand cursor? I’ve never seen it in use and also the “Normal Hand Texture” is never used.

    • Hi Jonas, the InteractionManager tracks both hands all the time. The code you need to modify is near the end of its OnGUI()-method and instead of displaying the cursor on cursorScreenPos, use leftHandScreenPos & rightHandScreenPos to display 2 cursor textures.

      Regarding the Lasso-state, it is currently considered as closed hand too, but you are free to modify this in HandStateToEvent()-function of IM. The normal-cursor texture is used only when the user or his hands are not tracked, for instance before the user ever gets detected or gets lost.
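      The modification described above could look roughly like the sketch below. The field names leftHandScreenPos and rightHandScreenPos come from the reply; the grip flags and cursor textures (isLeftHandGrip, gripHandTexture, releaseHandTexture) are hypothetical placeholders for whatever your version of the InteractionManager actually uses:

```csharp
// Hypothetical sketch of the OnGUI() change in InteractionManager that draws
// one cursor per hand. leftHandScreenPos/rightHandScreenPos are the fields
// mentioned in the reply above; the grip flags and textures are placeholders.
void OnGUI()
{
    if (KinectManager.Instance == null || !KinectManager.Instance.IsUserDetected())
        return;

    // Convert the normalized hand positions to screen rectangles (64x64 cursors).
    Rect leftRect = new Rect(leftHandScreenPos.x * Screen.width - 32f,
                             (1f - leftHandScreenPos.y) * Screen.height - 32f, 64f, 64f);
    Rect rightRect = new Rect(rightHandScreenPos.x * Screen.width - 32f,
                              (1f - rightHandScreenPos.y) * Screen.height - 32f, 64f, 64f);

    // Pick a texture per hand, depending on its current grip state.
    GUI.DrawTexture(leftRect, isLeftHandGrip ? gripHandTexture : releaseHandTexture);
    GUI.DrawTexture(rightRect, isRightHandGrip ? gripHandTexture : releaseHandTexture);
}
```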

      • Hi Rumen, thanks for your reply! I managed now to properly display a hand cursor for each hand with the method you told me. But now I’m struggling with the primaries for the hands. Only one hand at a time can be primary right now. That’s bad, because I want to use both hands for grabbing/ activating stuff. Any tips on how I can set both hands to be primary all the time?

        Thank you in advance!

      • I am using the InteractionDemo1 scene as a testing environment. In the InteractionManager script I tried to set both the isLeftHandPrimary and isRightHandPrimary bools to true, but that didn’t work. After that I tried to modify any code related to the primary bools, but it only accepts one hand as primary. On the infoGuiText you can see the primaries switching (the * after L.Hand/R.Hand changes to the other hand if I hide one hand from the Kinect). Also, the cursor on the “non-primary” hand is not changing its sprite to the grab cursor, but stays in the lasso-cursor shape, which is used right now if the hand is off screen/not used.

        E.g. I can grab stuff with the left hand. If I hide my left hand, the right hand becomes primary and it changes from the lasso cursor to the release-hand cursor. Now I can grab stuff with my right hand. The left hand still shows up as a cursor, but is not usable for interaction. The GrabDropScript only responds to the primary hand. It also seems like the movement of one hand affects the movement of the other hand (just a bit).

      • There are probably 1-2 small mistakes in your code changes. You can debug the code, to find out what is wrong. If you get really stuck, e-mail me next week, mention your invoice number and I’ll try to help you. This week I’m quite busy, and can’t do it. By the way, there will be some changes in the interaction manager, coming with the K2-asset update later this week.

      • Hi Rumen,

        Have you ever followed up on this request by coincidence?
        I would like to implement the same functionality (2 cursors), the problem isn’t displaying them, but making them work at the same time.

      • Yes, we made it work back then, together with Jonas. The problem was that this required duplication of many code parts in InteractionManager. That’s why I never added it to IM-updates. You can debug a bit the code in IM, related to left/right hand cursor control, and you will find out what duplication or changes are needed to make it work.

  8. Hi,
    my final year project is a running game.
    I am using the Kinect v2 SDK with these examples, and by using the avatar demo I’m able to control the avatar. But I have one issue: how do I control the avatar so that it runs and is also able to collect coins with its hands?

    • Hi, this is a matter of physics and colliders, in my opinion. Put, for instance, sphere colliders around the avatar’s hands and add a rigidbody, as in the KinectAvatarsDemo1-scene. Then add trigger colliders around the coins and finally check for collisions in your coin-collecting script.
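    The collider-and-trigger approach suggested above could be sketched like this. The “PlayerHand” tag is an assumption for this example; the avatar’s hand colliders are assumed to carry that tag and a rigidbody:

```csharp
using UnityEngine;

// Minimal sketch of a coin-collecting trigger. Attach this to each coin
// object that has a collider with 'Is Trigger' enabled; the avatar's hands
// are assumed to carry sphere colliders, a rigidbody, and the (hypothetical)
// "PlayerHand" tag.
public class CoinPickup : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("PlayerHand"))
        {
            // Update the score here, then remove the coin from the scene.
            Destroy(gameObject);
        }
    }
}
```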

  9. Hello Rumen!

    I am making a VR self-defense game with Kinect in Unity for my class project. I have a Kinect for Xbox 360. Will it run the Kinect mocap and Kinect v2 MS-SDK assets? If so, can I please get ‘Kinect v2 with MS-SDK’ and the mocap animator?

    Omkar Manjrekar

  10. Hi, I bought this asset package to make games, and it has been working out pretty well. However, I am trying to shift to the Universal Windows Platform so that I can target the xbox market. Do you have any plans to support the UWP platform? (On Unity it is labeled as Windows Store – Universal)

  11. Hey, I’m making a game that requires keyboard input to write an e-mail address, but it seems the InteractionManager interferes somehow. I’m using the UI InputField. Do you have any suggestions as to what could be causing the trouble?


    • The Unity event system uses only one input module – the one that is enabled and is on top of the list. That said, probably the InteractionInputModule hides the standard one (that processes keyboard and mouse input). I’ll try to reproduce & research this issue during the weekend, and look for a solution. Please email me next week to check, if I have found a solution or workaround.

  12. Hello Mr.Rumen!

    I’m making a senior project with Kinect in Unity, an AR fitting room. I have a Kinect v2. Do you have any suggestions about using your package in my project? Thanks for noticing this. Sorry for my bad English.

    Best Regards,
    Pakkanan Satha

  13. Hello Rumen,

    I am trying to get the audio source angle from Kinect. How could I implement it? I couldn’t find a public KinectSensor variable in KinectManager.

    Thank you!

    • Hi Jing, you can get the Kinect-v2 SDK-specific KinectSensor like this: ‘Windows.Kinect.KinectSensor kinectSensor = (KinectManager.Instance.GetSensorData().sensorInterface as Kinect2Interface).kinectSensor;’

      But there were some little tricks, when you need to get the audio beam angle from the audio source, as far as I remember. If you get stuck, please e-mail me and I’ll try to find the audio-beam tracking class I have written some time ago.
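      For readability, the one-liner above can be unpacked into a small helper with null checks. This is only a sketch; KinectManager.IsInitialized() is assumed to be available in your version of the asset:

```csharp
// Unpacking the one-liner above: reach the SDK-level KinectSensor through the
// K2-asset's sensor-interface abstraction. Returns null if the manager or the
// Kinect-v2 interface is not available.
Windows.Kinect.KinectSensor GetNativeSensor()
{
    KinectManager manager = KinectManager.Instance;
    if (manager == null || !manager.IsInitialized())
        return null;

    // The cast succeeds only when the Kinect-v2 interface is in use.
    Kinect2Interface sensorInt =
        manager.GetSensorData().sensorInterface as Kinect2Interface;
    return sensorInt != null ? sensorInt.kinectSensor : null;
}
```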

      • Hi Rumen,

        I am also trying to use the SpeechManager. Microsoft suggests that, for long recognition sessions, the SpeechRecognitionEngine be recycled (destroyed and recreated) periodically, say every 2 minutes, based on your resource constraints. I also want to turn off the AdaptationOn flag in UpdateRecognizerSetting. How can I do those?

        Thank you!

      • To recycle the speech manager, I suppose you need to put it in a prefab, then instantiate it and destroy the instance from time to time.

        Regarding the adaptation-on/off flag, see the last parameter of sensorData.sensorInterface.InitSpeechRecognition()-invocation in the Start()-method of SpeechManager.

      • I put the SpeechManager in a prefab, instantiate a clone in the Start() function, then destroy the clone and instantiate a new one every 60 seconds in the Update() function. The clones seem to get destroyed and recreated; however, the SpeechManager script only runs once. After 60s I can create a clone, but there is no speech-recognition functionality. Do I need to specifically tell the script to start running?
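        The recycling approach discussed in this thread could be sketched as a small component like the one below. The prefab field and the 60-second interval are assumptions, and, as noted above, whether a freshly instantiated SpeechManager re-runs its initialization depends on the asset version:

```csharp
using UnityEngine;

// Hedged sketch of the prefab-recycling idea: periodically destroy and
// re-instantiate a SpeechManager prefab for long recognition sessions.
// The prefab reference is assigned in the Inspector (an assumption).
public class SpeechManagerRecycler : MonoBehaviour
{
    public GameObject speechManagerPrefab;  // assign the SpeechManager prefab here
    private GameObject currentInstance;

    void Start()
    {
        currentInstance = Instantiate(speechManagerPrefab);
        // Re-create the instance every 60 seconds.
        InvokeRepeating(nameof(Recycle), 60f, 60f);
    }

    void Recycle()
    {
        if (currentInstance != null)
            Destroy(currentInstance);
        currentInstance = Instantiate(speechManagerPrefab);
    }
}
```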

    • It’s a matter of layers rendered by different cameras, in my opinion. See the 2nd background-removal demo. It shows the background + the BR-foreground image + 3D objects rendered behind and in front of the BR-image.

      • I opened the KinectBackgroundRemoval1 scene, but I can’t see myself, and Unity didn’t print any error. Why?

      • 1.log:K2-sensor opened, available: True
        2.log:Interface used: Kinect2Interface
        3.log:Shader level: 50
        4.log:Waiting for users.
        5.log:Adding user 0, ID: 72057594037941507, Body: 4

        but the demo (background removal) didn’t work. Why?

    • Model asset means the fbx-file of your rigged model, placed somewhere under the Assets-folder of your Unity project. It should be placed there, in order to be visible by the Unity editor.

  14. Hello,
    like you mentioned, I am currently doing my bachelor thesis, and my project is based on Kinect body tracking, so I urgently need this package to enable me to start my project and get the data from it. But I cannot find your e-mail address to contact you, so could you help me with this, please?
    Thank you.

  15. Hello, Rumen. Fantastic work. Really helpful.

    Is it possible to customize the grip event? I mean, HandState = Grip when the hand is 100% closed. Is it possible to customize that and detect HandState = Grip when the hand is, for example, 70% closed? I can’t find the class that does that in the project.

    Thanks in advance

    • Hi, the Kinect SDK provides only 3 discrete hand states – Release, Grip and Lasso. In this regard, there is no percentage available. These states are processed as hand interactions by the InteractionManager (script component of the KinectController-game object) in the demo scenes.

  16. Is it possible to use your package to create Xbox One app in the Unity and then run the app on the Xbox One device with connected kinect 2 ?

  17. Hello,

    Is it possible to run your examples in the Xbox One app made in Unity?
    So that I export the app in Unity as an Xbox One app and then run the example on the actual Xbox One device with kinect connected.


    • Hi Aldo, I’m not sure if it is the same, but take a look at the 3rd and 4th background-removal demo scenes. They do something similar, I think.

  18. Hi Rumen,

    I purchased this asset on the assetstore and it worked great for me.

    I have a question regarding the “seated” tracking mode. I have an app where the lower half of the user’s body (legs and torso) is not visible to the Kinect, and according to this: , seated mode is more suitable for my case.

    I tried the current tracking mode and it seems fine when parts of the legs are in range, but once the legs are out of range everything becomes a mess, even the hands, which are the only thing I really want to track.

    However, it seems that this functionality isn’t accessible via this asset as of now. Is there a workaround, so I can access “kinect.SkeletonStream.TrackingMode”?

    Best Regards,

    • Hi, there is no seated mode anymore in the Kinect SDK 2.0. It is a Kinect-v1-specific mode, as far as I know. But even then I never utilized this mode in my Unity assets, because it changes the body hierarchy.

      I would recommend using the AvatarControllerClassic-component instead of the AvatarController. Then assign to its settings only the joints belonging to the upper part of the avatar’s body. See the 3rd avatar-demo, if you need an example.

      If this solution is not enough, feel free to e-mail me your invoice number and paypal account, and I’ll refund you the money.

      • Thanks for the reply,

        I didn’t notice that seated mode is v1-only. I will try the AvatarControllerClassic demo and see if it gives better results, but I don’t think it will, because the problem is apparent in the user-map display even with no avatar controllers at all. That’s why I sought a different algorithm and thought seated mode was the solution.

        I will try a few different things, maybe we can redesign the project so it can rely on gestures or something.

        And don’t worry about the refund, I already used it on a different project and it worked fine.

        Thanks again and best regards,

  19. Hello Rumen!
    I am new to Kinect and Unity! May I ask whether it is necessary to buy this asset to control an avatar in Unity? I am a student, and it is expensive for me to purchase. Is there another solution for controlling an avatar in Unity with Kinect v2?

  20. Hi Rumen,

    I have an issue with the KinectManager script’s ‘Player calibration pose’ setting. When I set it to anything other than None, my model doesn’t appear. How do I set it to calibrate via the T-pose? Thanks!

    • Hi, make sure the KinectGestures-component is added to the KinectController-object in the scene, as well. The processing of all poses and gestures is in there.

  21. Hi Rumen
    What can be done so that the avatar’s hands do not go through its body? We have seen that there are some colliders in your examples, but we do not know if they are aimed at avoiding this situation.

    • Hi Mario, the current colliders were aimed more at preventing the avatar from going through other objects, but I think something similar should apply to preventing the hands from going through the body. Another option would be to make proper muscle adjustments in the avatar definition, and then enable the ‘Apply muscle limits’-setting of the AvatarController. Please experiment a bit.

  22. Hi Rumen,

    First, I’d like to thank you for a great asset! Works wonderfully well except for one little problem which I hope you can help me resolve.

    I’ve been using an older version of the package for nearly a year now, and only recently have I updated it to the latest version. After the update I was not able to move the position of the avatar, so I compared the AvatarController script between the two versions, and I noticed that you’re now setting the global position of the game object instead of the local position. Is there a way for me to manipulate the avatar’s position globally, like it was previously possible?

    Thanks in advance,

    • Hi Sid, why don’t you just copy back the older version of AvatarController, in case you find it more useful? I suppose you would need to fix some syntax errors in the process (like method invocation parameters, etc.), but it should be possible. In this regard, feel free to email me, if you have questions or get stuck anywhere in the process.

  23. Hello, I am a student at the Instituto Metropolitano de Diseño in Ecuador, and I am working on my graduation thesis, for which I need the asset (Kinect v2 Examples with MS-SDK). I would like to know if you could help me with it, since the project will not be commercialized or replicated. Thank you in advance for your prompt reply.

    • Sorry, my Spanish is not good at all. If you want to get a student’s copy of the K2-asset, please send me your request FROM your university e-mail address.
