Kinect v2 Examples with MS-SDK and Nuitrack SDK

‘Kinect v2 Examples with MS-SDK and Nuitrack SDK’ is a set of Kinect-v2 (aka ‘Kinect for Xbox One’) examples that use several major scripts, grouped in one folder. The package contains over thirty demo scenes.

Apart from the Kinect-v2 and v1 sensors, the K2-package supports Intel’s RealSense D400-series, as well as Orbbec Astra & Astra-Pro sensors, via the Nuitrack body-tracking SDK. Please mind that the Nuitrack SDK is at an early stage, and issues are possible.

The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demos how to use the programmatic or visual gestures, the fitting-room demos how to create your own dressing room, the overlay demos how to overlay body parts with virtual objects, etc. You can find short descriptions of all demo scenes in the K2-asset online documentation.

This package works with Kinect-v2 and Kinect-v1 (aka ‘Kinect for Xbox One’ & ‘Kinect for Xbox 360’), Intel RealSense D415 & D435, Orbbec Astra & Astra-Pro, and other Nuitrack-supported sensors as well. It can be used with all versions of Unity – Free, Plus & Pro.

Free for education:
The package is free for academic use. If you are a student, lecturer or academic researcher, please e-mail me to get the K2-asset directly from me.

Free copy of ‘Kinect Mocap Animator’:
As of 17.Aug.2018, I provide the ‘Kinect Mocap Animator’-package free of charge for personal use to anyone who requests it by e-mail and has a valid invoice for the K2-asset.

One request:
My only request is NOT to share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent. Please respect my work.

Customer support:
* First, please check if you can find the answer you’re looking for on the Tips, tricks and examples page, as well as on K2-asset Online Documentation page. See also the comments below the articles here, or in the Unity forums. If it is not there, you may contact me, but please don’t do it on weekends or holidays. This is the only time I can have some rest.
* If you e-mail me, please include your invoice number. I provide free e-mail support for the first 3 months after the product purchase (limited to max 10 issues). More information regarding e-mail support can be found here.
* Please mind, you can always update your K2-asset free of charge, from the Unity Asset store.

How to run the demo scenes:
1. (Kinect-v2) Download and install Kinect for Windows SDK v2.0. The download link is below.
2. (Kinect-v2) If you want to use Kinect speech recognition, download and install the Speech Platform Runtime, as well as EN-US plus the other needed language packs. The download links are below.
3. (Nuitrack or Orbbec) If you want to work with Nuitrack body tracking SDK, look at this tip. If you want to work with Orbbec Astra or Astra-Pro sensors via OpenNI2, look at this tip.
4. Import this package into a new Unity project.
5. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
6. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
7. Open and run a demo scene of your choice, from a subfolder of ‘K2Examples/KinectDemos’-folder. The short descriptions of all demo scenes are available here.
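As a quick sanity check after the setup steps above, a minimal script like this can log when a user gets detected. This is only a sketch, assuming the K2-asset’s public KinectManager API (Instance, IsInitialized, IsUserDetected, GetPrimaryUserID); attach it to any game object in a scene that contains the KinectController prefab.

```csharp
using UnityEngine;

// Sanity-check sketch: logs the primary user ID, once the sensor
// is initialized and a user steps in front of it.
public class SensorCheck : MonoBehaviour
{
    private bool userReported = false;

    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager == null || !manager.IsInitialized())
            return;  // sensor not ready yet (or SDK missing)

        if (manager.IsUserDetected() && !userReported)
        {
            Debug.Log("User detected: " + manager.GetPrimaryUserID());
            userReported = true;
        }
    }
}
```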

* Kinect for Windows SDK 2.0 (Windows-only) can be found here.
* MS Speech Platform Runtime v11 can be downloaded here. Please install both x86 and x64 versions, to be on the safe side.
* Kinect for Windows SDK 2.0 language packs can be downloaded here. The language codes are listed here.

* The official online documentation of the K2-asset is available here.

* The official release of the ‘Kinect v2 with MS-SDK’ is available at Unity Asset Store. All updates are free of charge.

* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, you need to set the build platform to ‘Windows standalone’. For more information look at this tip.
* If the Unity editor crashes, when you start demo-scenes with face-tracking components, look at this workaround tip.
* If the demo scene reports errors or remains in ‘Waiting for users’-state, make sure you have installed Kinect SDK 2.0, as well as the other needed components, and the sensor is connected.

* Here is a link to the project’s Unity forum:
* Many Kinect-related tips, tricks and examples are available here.

Known Issues:
* If you get compilation errors, like “Type `System.IO.FileInfo’ does not contain a definition for `Length“, open the project’s Build-Settings (menu File / Build Settings) and make sure that ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left. On the right side, ‘Windows’ must be the ‘Target platform’, and ‘x86’ or ‘x86_64’ – the ‘Architecture’. If Windows Standalone was not the default platform, do these changes and then click on ‘Switch Platform’-button, to set the new build platform. If the Windows Standalone platform is not installed at all, run UnityDownloadAssistant again, then select and install the ‘Windows Build Support’-component.
* If you experience Unity crashes, when you start the Avatar-demos, Face-tracking-demos or Fitting-room-demos, this is probably due to a known bug in the Kinect face-tracking subsystem. In this case, first try to update the NVidia drivers on your machine to their latest version from NVidia website. If this doesn’t help and the Unity editor still crashes, please disable or remove the FacetrackingManager-component of KinectController-game object. This will provide a quick workaround for many demo-scenes. The Face-tracking component and demos will still not work, of course.
* Unity 5.1.0 and 5.1.1 introduced an issue, which causes some of the shaders in the K2-asset to stop working. They worked fine in 5.0.0 and 5.0.1 though. The issue is best visible, if you run the background-removal-demo. In case you use Unity 5.1.0 or 5.1.1, the scene doesn’t show any users over the background image. The workaround is to update to Unity 5.1.2 or later. The shader issue was fixed there.
* If you update an existing project to K2-asset v2.18 or later, you may get various syntax errors in the console, like this one: “error CS1502: The best overloaded method match for `Microsoft.Kinect.VisualGestureBuilder.VisualGestureBuilderFrameSource.Create(Windows.Kinect.KinectSensor, ulong)’ has some invalid arguments”. This may be caused by the moved ‘Standard Assets’-folder from Assets-folder to Assets/K2Examples-folder, due to the latest Asset store requirements. The workaround is to delete the ‘Assets/Standard Assets’-folder. Be careful though. ‘Assets/Standard Assets’-folder may contain scripts and files from other imported packages, too. In this case, see what files and folders the ‘K2Examples/Standard Assets’-folder contains, and delete only those files and folders from ‘Assets/Standard Assets’, to prevent duplications.
* If you want to release a Windows 32 build of your project, please download this library (for Kinect-v2) and/or this library (for Kinect-v1), and put them into K2Examples/Resources-folder of your project. These 32-bit libraries are stripped out of the latest releases of K2-asset, in order to reduce its package size.

What’s New in Version 2.18.x:
1. Added ‘Body blur filter’-setting to the Background-removal-manager, to allow finer control over body silhouette blurring in the process of background removal.
2. Added ‘Erode iterations 0’-setting to the Background-removal-manager, to allow initial erode of the body silhouette, before the dilate-filter.
3. Added ‘Body contour color’-setting to the Background-removal-manager, to allow the color of the resulting body contour to be adjusted for the target environment (thanks to N. Naydenov).
4. Added new gesture-poses – TouchRightElbow & TouchLeftElbow (thanks to N. Naydenov).
5. Updated Model-selector-component to allow single dressing menu to be reused by multiple model selectors in the Fitting-room-demo1 scene (thanks to Arhan).
6. Updated KinectManager to check for existing KinectGestures-component, when there are gesture listeners in the scene.
7. Moved Plugins & Standard-Assets-folders to the K2Examples-folder, to meet the Asset-store requirement for a single-folder asset.
8. (2.18.1) Updated arm and leg orientation estimation, to improve the clothing-model poses in fitting room demo scenes (thanks to Alay M).
9. (2.18.1) Updated avatar scaler to avoid model scaling, when the user is too close (thanks to r618).

Videos worth more than 1000 words:
Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

..a video by Ranek Runthal, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

..and a video by Brandon Tay, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.0:

703 thoughts on “Kinect v2 Examples with MS-SDK and Nuitrack SDK”

  1. Hi Rumen: I now get about 10 errors: Assets/KinectScripts/AvatarController.cs(12,14): error CS0101: The namespace `global::’ already contains a definition for `AvatarController’
    such as these. I think I followed your directions pretty well. Any ideas?

    • Hi, I just purchased this. I got this error when trying to run the sample scene.

      Assets/K2Examples/KinectScripts/Interfaces/NuitrackInterface.cs(529,23): error CS0246: The type or namespace name `nuitrack’ could not be found. Are you missing an assembly reference?


      • Hi, what platform is selected in your Build Settings? And are you using the latest version (v2.17.3) of the K2-asset, or an older one?

      • Yes, I’m using the current version (v2.17.3). How can I access the previous version?
        My target platform is Windows, and I tried both x86 and x86_64 and got the same error.

  2. Hi,
    Loving the asset.
    I am working with Unity 2017.3.
    I have an issue in the “KinectFittingRoom1” demo scene: the color image is flipped to the opposite side.

    I changed the shader from “userBlendedshader” to “ForegroundBlendedShader”, but then the depth is gone.

    Please help me!

    And the background-removal scene is not working, either.

    • Hi, I just checked both FittingRoomDemo1 & BackgroundRemovalDemo1 with Kinect-v2 in Unity 2017.3 editor, and it worked as expected. Are you using Kinect-v2 or some other sensor?

      Please contact me by e-mail and send me some screenshots, and the Unity log file as well. This way I could see what happens at your end. Please mention your invoice number, too. Here is how to find the Unity log-files on your platform:

      • Ok, already answered. Please check the version of the K2-asset you’re using – see ‘WhatsNew-Kinect2-MsSdk.pdf’ in the _Readme-folder. I think your copy is quite old, because similar issues were fixed already, in the more recent releases.

    • Assets/K2Examples/KinectScripts/Interfaces/NuitrackInterface.cs(529,23): error CS0246: The type or namespace name `nuitrack’ could not be found. Are you missing an assembly reference?

      • Hi, what platform is selected in your Build Settings? And are you using the latest version (v2.17.3) of the K2-asset, or an older one?

      • If you are using Kinect-v2 or Kinect-v1, the selected platform should be ‘PC, Mac & Linux Standalone’ with target platform ‘Windows’ and architecture ‘x86_64’. This should fix the errors.

      • To remove the Nuitrack-interface, delete KinectScripts/Interfaces/NuitrackInterface.cs, the KinectScripts/Interfaces/Nuitrack-folder and comment out the references to NuitrackInterface in KinectScripts/KinectInterop.cs.

      • I tried this

        To remove the Nuitrack-interface, delete KinectScripts/Interfaces/NuitrackInterface.cs, the KinectScripts/Interfaces/Nuitrack-folder and comment out the references to NuitrackInterface in KinectScripts/KinectInterop.cs.

        No errors now, but the demo scenes are not working.
        Please note that I am working with a Kinect-v2.

      • There is something generally wrong in your setup. I don’t remember anyone else so far having similar problems with Kinect-v2 on Windows standalone platform. I suppose you have installed the Kinect SDK 2.0, as per setup instructions.

        In this regard, may I ask you to e-mail me the Unity log file after running and stopping the scene, so I can see what happens. Here is where to find it: Please also mention your invoice number in the e-mail.

  3. Hi, is it possible to use an ordinary camera (webcam) for motion detection?

    Another question, about your KinectFaceTrackingDemo3 scene: is it possible to get the exact data of the face only? I mean, we just need the background-removed face and neck, and we want to remove the quad boundary seen in the background removal.

    • The motion detection is done by segmenting users from the depth-camera image, as far as I remember. If you want to use an external color camera along with it, you would need to calibrate the depth camera and the new color camera, to get correct color-image overlays.

      To the 2nd question: See the source of the SetFaceTexture-component in the scene. It extracts the face texture and applies it to the given target object (in this case – the Quad). You can modify the code, to apply the texture to an object of your choice. Hope I understood your issues correctly.
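As a rough illustration of that modification (hypothetical names – the actual SetFaceTexture script differs), applying an extracted face texture to an arbitrary renderer instead of the demo Quad could look like this:

```csharp
using UnityEngine;

// Hypothetical sketch: route a face texture to any renderer you choose,
// e.g. the head mesh of a rigged model, instead of the demo Quad.
public class FaceTextureTarget : MonoBehaviour
{
    public Renderer targetRenderer;  // assign the desired mesh in the Inspector

    // Call this with the texture produced by the face-tracking code.
    public void ApplyFaceTexture(Texture faceTex)
    {
        if (targetRenderer != null && faceTex != null)
            targetRenderer.material.mainTexture = faceTex;
    }
}
```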

      • Thanks Rumen! As of now I am trying to use the SetFaceTexture quad component with other rigged models (a combination of avateering + SetFaceTexture for the face, but I only want to expose the head). But I can’t figure out the SetFaceTexture error saying:

        Texture has out of range width / height
        UnityEngine.Texture2D:Resize(Int32, Int32)
        SetFaceTexture:Update() (at Assets/K2Examples/KinectDemos/FaceTrackingDemo/Scripts/SetFaceTexture.cs:166)

        I tried to copy the properties of the quad from the original KinectFaceTrackingDemo3 demo, but it seems it doesn’t work. I made the quad (FaceView, based on your example) a child of the neck of the rigged model. Can you tell me the cause of this error?

        PS: Didn’t modify anything in the SetFaceTexture script.


      • The face texture uses the face part (determined by the face-width & face-height parameters, around the head) of the color image or the background-removed image (called foreground texture). That’s why the BackgroundRemovalManager-component is in the scene, too. If you look above line 166 in the script, there are some checks that the face rectangle is not beyond the limits of the foreground or color-camera textures (they have equal resolutions). Maybe this check estimates negative values for the width or height of the face texture sometimes. It’s easy to debug and find out, I think. There is nothing special about this script, really 😉
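The bounds check described in this reply can be sketched as a small helper (illustrative names, not the actual SetFaceTexture code): clamp the face rectangle to the texture and reject degenerate sizes before calling Texture2D.Resize(), which throws on zero or negative dimensions.

```csharp
// Plain C# sketch of a face-rectangle bounds check, so it can run
// outside Unity. Coordinates are pixels in the color/foreground texture.
public static class FaceRectUtil
{
    // Clamps the rect in place; returns false if nothing usable remains.
    public static bool TryClampRect(ref int x, ref int y, ref int w, ref int h,
                                    int texWidth, int texHeight)
    {
        if (x < 0) { w += x; x = 0; }   // shift left edge into the texture
        if (y < 0) { h += y; y = 0; }   // shift top edge into the texture
        if (x + w > texWidth)  w = texWidth - x;
        if (y + h > texHeight) h = texHeight - y;

        // a zero or negative size here is what would make Resize() fail
        return w > 0 && h > 0;
    }
}
```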

  4. Hi,
    I am using an “Orbbec Astra Pro” and “Nuitrack” now,
    but no color is displayed in the “KinectOverlayDemo1” scene.

    Can you tell me how to use the color stream of the “Orbbec Astra Pro” with “Kinect v2 Examples with MS-SDK and Nuitrack SDK”?

    This sentence uses Google Translate.

    thank you.

    • This is one of the fixes in the upcoming v2.17.2 of the K2-asset. I submitted it yesterday to the Unity Asset store, but it is not yet published. If you want to get it earlier, please e-mail me your request, along with the invoice number you got from the Asset store.

      • Hi,
        with the current version (2.17.2) of “Kinect v2 Examples with MS-SDK and Nuitrack SDK”,
        I confirmed that the “Orbbec Astra Pro” and “Nuitrack” work properly in the “KinectOverlayDemo1” scene.

        This sentence is using Google translation.

        I was saved. Thank you!

  5. Hello Rumen,

    I built and ran the projects and, to my folly, I tried to see if the Orbbec Astra would still work (detect skeletons) if several of those apps ran in the background. The demos didn’t crash, but they didn’t work either, and now, even after I reset everything, the whole thing has stopped working. I’m not using the trial version of Nuitrack, either. Any thoughts on how I can fix this?

  6. Hello! The package is absolutely fantastic. So much stuff for $20. My goal is to track a person that is lying down on his back. It doesn’t work, I guess because the depth difference between the body and the table the person is lying on is too small. Do you have any idea which settings I should tune so that it can work, if that is possible at all? Thanks for everything!


    • The Kinect’s face-tracking subsystem tracks the positions of the eyes. See the KinectFaceTrackingDemo5 scene in the KinectDemos/FaceTrackingDemo-folder. But if by eye-tracking you mean tracking the direction the user is looking, this is not supported. In this case, better use the forward direction of the head.

  7. Hi Rumen, I use your Asset with a Kinect v2 atm. Apart from installing the SDKs and following your tips, do I have to make any changes to my Unity Project to get it to work with an Orbbec device? Cheers.

  8. Hi Rumen, is it possible to create a fake body that can be detected by the Kinect as a user? Or any idea how I can accomplish this? I will use this for testing purposes. Thanks.

    • Hi, for testing purposes you can make recordings with the Kinect Studio (part of Kinect SDK 2.0) and then play them back, after you run the scene.

  9. Hello! I have used this asset for making a body-tracking system.
    I changed my Kinect-v2 to an Intel RealSense D415 (with Nuitrack),
    and I played the KinectAvatarsDemo1 scene included in this asset.
    Of course it didn’t work – the demo scene used the Kinect’s library and ignored the others.
    So I removed the Kinect’s library from my computer.
    But the demo couldn’t find the libnuitrack dll-file, and the DummyK2Interface got linked instead,
    although I’ve never installed the regarding library.

    How can I play the demo scene with Nuitrack? I’ve already installed the Nuitrack SDK.

  10. Hi, I’m working with a RealSense D435 and the Kinect-v2 package. I changed ‘sensorAlwaysAvailable’ from ‘true’ to ‘false’. When I run the Avatar-demo scene, I see the message “No sensor found. Make sure you have installed the SDK and the sensor is connected.” But I have set up the RealSense SDK. Please help me. Thanks…

  11. Hello Rumen, when moving from one scene to another it gets slow, and the hand-cursor makes it difficult to grasp objects. Could you recommend something to improve this and make my game faster?

    • See ‘Howto-Use-KinectManager-Across-Multiple-Scenes.pdf’ located in K2Example/_Readme-folder in the K2-asset, as well as its implementation in KinectDemos/MultiSceneDemo. To run this demo, you need to put the scenes (in the same order) into the Scenes-in-Build list of Build settings.

      I suppose you have KinectManager-components in each scene, but there should be only one – in the startup scene. See the demo.
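A minimal sketch of that single-KinectManager pattern (scene indices and class names are placeholders; it assumes the manager persists across scene loads, as the multi-scene demo sets it up):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Startup scene: the ONLY scene that contains the KinectController
// object with the KinectManager-component.
public class StartupLoader : MonoBehaviour
{
    void Start()
    {
        // Load the first "real" scene, per the Scenes-in-Build list.
        SceneManager.LoadScene(1);
    }
}

// Any later scene: fetch the shared instance instead of adding
// another KinectManager-component.
public class SceneConsumer : MonoBehaviour
{
    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager != null && manager.IsInitialized() && manager.IsUserDetected())
        {
            // ...scene-specific logic here...
        }
    }
}
```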

  12. Hello Rumen, we are using your Kinect-v2 plugin for Unity and we face a problem: after attaching the AvatarController-component to our own model, when testing with the Kinect, the shoulders drop and deform the mesh. We tried different rig variations. So far the only solution we found was to use AvatarControllerClassic, but with it there are jitters in model parts like the feet and arms. Is there anything you can recommend us to do?

    • I suppose you mean the 1st fitting-room demo, right? If this is the case, first enable the BackgroundRemovalManager-component of KinectController in the scene, and then simply update the texture of the RawTexture-component of BackgroundImage1 from your script, as needed.

  13. Hi Rumen, my Orbbec Astra Pro works well in Unity when targeting the PC platform,
    but doesn’t when targeting Android. Do you have any idea?
    I have to develop for Android, because my client requires the project to run on the Orbbec Persee.

    • Hi, I don’t have experience with Persee, so can’t help you much with it. As to running on Android, I think you need to use the Nuitrack SDK. As far as I remember Nuitrack supported the Android platform, and the K2-asset has Nuitrack interface, as well.

      • I already know about the Nuitrack SDK, and your asset runs successfully on Android OS.
        I just wonder whether it is possible to run in the Unity editor with the platform switched to Android.
        As you already know, we can’t use the Nuitrack SDK there, because the PC uses Windows.
        But this issue is not about your asset, but about the development environment.
        I’ll try to solve it by myself. Thanks for your advice.

  14. Hi Rumen
    I’m a teacher at a university, and I’m trying to use a Kinect-v2 in Unity for educational purposes. I’m reading your post and it looks very interesting. I’m looking for free resources; maybe you can help.

    thank you in advance

  15. Hi Rumen!

    I have to develop a depth-check and object-rendering program.
    First, I find the points where walls, stones or other obstacles are placed, using only the depth values.
    Second, in the color frame, I render some color or object at the points where the stones or obstacles are.

    Do you have any recommendations for me?
    Any algorithm, tutorial in your asset, etc.?

    • Additionally, I have to render the silhouettes of the tracked users, and
      circles at the points where the users’ hands and feet are.
      If you have an idea, please give me advice.

      • Finally, the NuitrackInterface logs ‘Depth sensor: nuitrack.NativeDepthSensor, width: 640, height: 480’.
        But the depth-stream resolution of the RealSense D415 is 1280×720.
        How can I get the depthImageBufferData at 1280×720?
        The same goes for the ColorFrame: the RGB-stream resolution of the RealSense D415 is 1920×1080,
        but the NuitrackInterface gives me 640×480 colorImageBufferData.

    • I’m not sure about the details of your setup, but I suppose the “normal” situation is not to have any obstacles in the view, in the sense of walls and stones. In this regard, you could check for depth points within some range (left-right and up-down of the camera), and if there are points with depth values less than (let’s say) 4 meters, consider them an obstacle and render them accordingly.

      One other option would be to use some object detection algorithm (for instance deep learning or OpenCV object detection) and use it to look for objects.
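The first option – thresholding the depth points – can be sketched in plain C#. The assumptions here: the raw depth comes as one ushort per pixel in millimeters, 0 means “no reading”, and 4 m is just the example cut-off from above.

```csharp
// Sketch of the simple threshold approach: mark every valid depth
// sample closer than maxDistM as an obstacle.
public static class ObstacleDetector
{
    // Returns a per-pixel mask, same layout as the input depth map.
    public static bool[] FindObstacles(ushort[] depthMm, float maxDistM = 4f)
    {
        ushort maxMm = (ushort)(maxDistM * 1000f);
        bool[] mask = new bool[depthMm.Length];

        for (int i = 0; i < depthMm.Length; i++)
        {
            // 0 means the sensor returned no reading for this pixel
            mask[i] = depthMm[i] > 0 && depthMm[i] < maxMm;
        }

        return mask;
    }
}
```

The resulting mask can then be used to tint the matching pixels of the color frame, once depth and color are registered to each other.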

      • Thanks for your reply. I’ll try the first of your solutions.
        Your assets, knowledge and your favor help me a lot.

  16. Hi Rumen,
    I am a student from XJTLU (Xi’an Jiaotong-Liverpool University), working on a project that explores the suitability of exergames for virtual-reality head-mounted displays. I noticed that your work would be of great help to my research. Would you give me a copy of the K2-asset, to let me complete my research? (I promise not to use it for any commercial purpose.)
    Zeying Zhang

  17. Hi Rumen,
    Great asset, always a pleasure to use!
    I’ve moved over to the RealSense D415; thanks to your tip, I have gotten it working in Unity with your package.
    My inquiry is about background removal / RGB alignment. As you mentioned, they do not currently work with RealSense. Is there any advice you can give, to implement my own version or fix the existing scenes?

    • Hi, the background removal should work with the RealSense D415 as well. It is not as good as with Kinect-v2, but it should work. One option to explore is to set ‘Depth2ColorRegistration’ in /data/nuitrack.config to ‘true’ and then try again. Please also upgrade to the latest version of the K2-asset, if you are using an older release.
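For reference, the fragment of nuitrack.config in question might look roughly like this (illustrative only – check the actual file in your Nuitrack installation’s data folder for the exact key names and surrounding settings):

```json
{
  "DepthProvider": {
    "Depth2ColorRegistration": true
  }
}
```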

  18. Hi Rumen!
    I followed your previous advice using an OpenCV object-detection algorithm,
    but OpenCV object detection hugely depends on light and shadow,
    so I couldn’t get good enough results.

    So I’m trying to get the depth map and the farthest depth value (which means the wall),
    and I’ll regard all points whose values are less than the wall’s value as obstacles.

    As a first step, I want to render the depth map to a RawImage texture.
    In the KinectManager class I can render the color texture with the GetUserClrTex() method,
    but I can’t find one for the depth texture. Is there a method for the depth texture?

    • And can I get the real distance between the camera and each point?
      For comparison, I would rather use the real distance than the respective depth value.

    • Hi, I meant you need to use the depth image and process it with some OpenCV algorithm, to find the clusters of depth points and their respective centers. But your simplified approach sounds reasonable, as well.

      Regarding your question: First set the ‘Compute user map’-setting of KinectManager to ‘User texture’. Then open DepthShader.shader in K2Examples/Resources-folder and uncomment the commented out code (the else-part) near the end of the shader. This will allow the shader to render the full depth picture, not only the detected users. Finally, you can get the currently rendered depth texture by calling KinectManager.Instance.GetUsersLblTex() or GetUsersLblTex2D(), if you need a Texture2D-texture.
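Putting those steps together, the usage side might look like this (a sketch against the K2-asset API named above; the RawImage would be a UI element in your canvas):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: show the rendered depth/user texture on a UI RawImage,
// assuming 'Compute user map' is set to 'User texture' and the
// DepthShader was modified to render the full depth picture.
public class DepthView : MonoBehaviour
{
    public RawImage rawImage;  // assign in the Inspector

    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager != null && manager.IsInitialized() && rawImage != null)
            rawImage.texture = manager.GetUsersLblTex();
    }
}
```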

      • Thanks for your reply 🙂
        I have one last question.
        I’m really sorry to bother you.

        Is it right that the GetRawDepthMap() method of the KinectManager class returns
        the real distances between the sensor and each point, in a ushort array?
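As far as I know, yes: the Kinect-v2 raw depth map holds one ushort per pixel with the distance in millimeters (0 meaning no reading), in row-major order at the depth resolution. A tiny helper pair, as an illustration:

```csharp
// Sketch of working with a raw depth map: millimeters-to-meters
// conversion and a row-major pixel lookup.
public static class DepthUtil
{
    // Convert a raw depth sample (millimeters, 0 = no reading) to meters.
    public static float DepthToMeters(ushort rawMm)
    {
        return rawMm / 1000f;
    }

    // Look up the sample at pixel (x, y) in a row-major depth map.
    public static ushort SampleAt(ushort[] depthMap, int width, int x, int y)
    {
        return depthMap[y * width + x];
    }
}
```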
