Kinect v2 Examples with MS-SDK

‘Kinect v2 Examples with MS-SDK’ is a set of Kinect-v2 (aka ‘Kinect for Xbox One’) examples that use several major scripts, grouped in one folder. The package contains over thirty demo scenes.

Please look at the Azure Kinect Examples for Unity asset, as well. It works with the Azure Kinect (aka Kinect-for-Azure, K4A), as well as with Kinect-v2 (aka Kinect-for-Xbox-One) and RealSense D400 series sensors.

Apart from the Kinect-v2 and v1 sensors, the K2-package supports Intel’s RealSense D400-series, as well as Orbbec Astra & Astra-Pro sensors, via the Nuitrack body tracking SDK.

The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, gesture-demos – how to use the programmatic or visual gestures, fitting room demos – how to create your own dressing room, overlay demos – how to overlay body parts with virtual objects, etc. You can find short descriptions of all demo-scenes in the K2-asset online documentation.

This package can work with Kinect-v2 (aka Kinect for Xbox One), Kinect-v1 (aka Kinect for Xbox 360), as well as with Intel RealSense D415 & D435, Orbbec Astra & Astra-Pro via the Nuitrack body tracking SDK. It can be used in all versions of Unity – Free, Plus & Pro.

Free for education:
The package is free for academic use. If you are a student, lecturer or academic researcher, please e-mail me to get the K2-asset directly from me.

One request:
My only request is NOT to share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.

Customer support:
* First, please check if you can find the answer you’re looking for on the Tips, tricks and examples page, as well as on K2-asset Online Documentation page. See also the comments below the articles here, or in the Unity forums. If it is not there, you may contact me, but please don’t do it on weekends or holidays.
* If you e-mail me, please include your invoice number. More information regarding e-mail support can be found here.
* Please mind that you can always update your K2-asset free of charge from the Unity Asset store.

How to run the demo scenes:
1. (Kinect-v2) Download and install Kinect for Windows SDK v2.0. The download link is below.
2. (Kinect-v2) If you want to use Kinect speech recognition, download and install the Speech Platform Runtime, as well as EN-US (and other needed) language packs. The download links are below.
3. (Nuitrack) If you want to work with Nuitrack body tracking SDK, please look at this tip.
4. Import this package into a new Unity project.
5. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’. The target platform should be ‘Windows’ and architecture – ‘x86_64’.
6. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
7. Open and run a demo scene of your choice, from a subfolder of ‘K2Examples/KinectDemos’-folder. Short descriptions of all demo scenes are available here.

* Kinect for Windows SDK v2.0 (Windows-only) can be found here.
* MS Speech Platform Runtime v11 can be downloaded here. Please install both x86 and x64 versions, to be on the safe side.
* Kinect for Windows SDK 2.0 language packs can be downloaded here. The language codes are listed here.

* The online documentation of the K2-asset is available here and as a pdf-file, as well.

* The official release of the ‘Kinect v2 with MS-SDK’ is available at Unity Asset Store. All updates are free of charge.

* If you get errors like ‘Texture2D’ does not contain a definition for ‘LoadImage’ or ‘Texture2D’ does not contain a definition for ‘EncodeToJPG’, please open the Package Manager, select ‘Built-in packages’ and enable the ‘Image conversion’ and ‘Physics 2D’ packages.
* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, you need to set the build platform to ‘Windows standalone’. For more information look at this tip.
* If the Unity editor crashes, when you start demo-scenes with face-tracking components, please look at this workaround tip.
* If the demo scene reports errors or remains in ‘Waiting for users’-state, make sure you have installed Kinect SDK 2.0, the other needed components, and check if the sensor is connected.

* Here is a link to the project’s Unity forum:
* Many Kinect-related tips, tricks and examples are available here.
* The official online documentation of the K2-asset is available here.

Known Issues:
* If you get compilation errors, like “Type `System.IO.FileInfo’ does not contain a definition for `Length“, open the project’s Build-Settings (menu File / Build Settings) and make sure that ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left. On the right side, ‘Windows’ must be the ‘Target platform’, and ‘x86’ or ‘x86_64’ – the ‘Architecture’. If Windows Standalone was not the default platform, make these changes and then click the ‘Switch Platform’-button, to set the new build platform. If the Windows Standalone platform is not installed at all, run UnityDownloadAssistant again, then select and install the ‘Windows Build Support’-component.
* If you experience Unity crashes, when you start the Avatar-demos, Face-tracking-demos or Fitting-room-demos, this is probably due to a known bug in the Kinect face-tracking subsystem. In this case, first try to update the NVidia drivers on your machine to their latest version from NVidia website. If this doesn’t help and the Unity editor still crashes, please disable or remove the FacetrackingManager-component of KinectController-game object. This will provide a quick workaround for many demo-scenes. The Face-tracking component and demos will still not work, of course.
* Unity 5.1.0 and 5.1.1 introduced an issue, which causes some of the shaders in the K2-asset to stop working. They worked fine in 5.0.0 and 5.0.1 though. The issue is best visible, if you run the background-removal-demo. In case you use Unity 5.1.0 or 5.1.1, the scene doesn’t show any users over the background image. The workaround is to update to Unity 5.1.2 or later. The shader issue was fixed there.
* If you update an existing project to K2-asset v2.18 or later, you may get various syntax errors in the console, like this one: “error CS1502: The best overloaded method match for `Microsoft.Kinect.VisualGestureBuilder.VisualGestureBuilderFrameSource.Create(Windows.Kinect.KinectSensor, ulong)’ has some invalid arguments”. This may be caused by the moved ‘Standard Assets’-folder from Assets-folder to Assets/K2Examples-folder, due to the latest Asset store requirements. The workaround is to delete the ‘Assets/Standard Assets’-folder. Be careful though. ‘Assets/Standard Assets’-folder may contain scripts and files from other imported packages, too. In this case, see what files and folders the ‘K2Examples/Standard Assets’-folder contains, and delete only those files and folders from ‘Assets/Standard Assets’, to prevent duplications.
* If you want to release a Windows 32 build of your project, please download this library (for Kinect-v2) and/or this library (for Kinect-v1), and put them into K2Examples/Resources-folder of your project. These 32-bit libraries are stripped out of the latest releases of K2-asset, in order to reduce its package size.

What’s New in Version 2.21:
1. Added ‘Fixed step indices’ to the user detection orders, to allow user detection in adjacent areas.
2. Added ‘Central position’-setting to the KinectManager, to allow user detection by distance, according
to the given central position.
3. Added ‘Users face backwards’-setting to the KM, to ease the backward-facing setups.
4. Updated Nuitrack-sensor interface, to support the newer Nuitrack SDKs (thanks to Renjith P K).
5. Cosmetic changes in several scenes and components.

Videos worth more than 1000 words:
Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…a video by Ranek Runthal, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…and a video by Brandon Tay, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.0:

787 thoughts on “Kinect v2 Examples with MS-SDK”

  1. Hi Rumen: I now get about 10 errors: Assets/KinectScripts/AvatarController.cs(12,14): error CS0101: The namespace `global::’ already contains a definition for `AvatarController’
    such as these. I think I followed your directions pretty well. Any ideas?

    • Hi, I just purchased this. I got this error when trying to run the sample scene.

      Assets/K2Examples/KinectScripts/Interfaces/NuitrackInterface.cs(529,23): error CS0246: The type or namespace name `nuitrack’ could not be found. Are you missing an assembly reference?


      • Hi, what platform is selected in your Build Settings? And are you using the latest version (v2.17.3) of the K2-asset or an older one?

      • Yes, I’m using the current version (v2.17.3). How can I access the previous version?
        My target platform is Windows, and I tried both x86 and x86_64 and got the same error.

  2. Hi,
    Loving the asset.
    I am working in Unity 2017.3.
    I have an issue in the “KinectFittingRoom1” demo scene. The color image is flipped to the opposite side.

    I changed the shader from “userBlendedshader” to “ForegroundBlendedShader”, but the depth is gone.

    please help me !!!

    and also the Background removal scene is not working

    • Hi, I just checked both FittingRoomDemo1 & BackgroundRemovalDemo1 with Kinect-v2 in Unity 2017.3 editor, and it worked as expected. Are you using Kinect-v2 or some other sensor?

      Please contact me by e-mail and send me some screenshots, and the Unity log-file as well. This way I could see what happens at your end. Please mention your invoice number, too. Here is how to find the Unity log-files on your platform:

      • Hi, I just sent you an email. Please check and find the attachment; I have attached a screenshot. Please help me.
        Thanks for the reply.

      • Ok, already answered. Please check the version of the K2-asset you’re using – see ‘WhatsNew-Kinect2-MsSdk.pdf’ in the _Readme-folder. I think your copy is quite old, because similar issues were fixed already, in the more recent releases.

    • Assets/K2Examples/KinectScripts/Interfaces/NuitrackInterface.cs(529,23): error CS0246: The type or namespace name `nuitrack’ could not be found. Are you missing an assembly reference?

      • Hi, what platform is selected in your Build Settings? And are you using the latest version (v2.17.3) of the K2-asset or an older one?

      • If you are using Kinect-v2 or Kinect-v1, the selected platform should be ‘PC, Mac & Linux Standalone’ with target platform ‘Windows’ and architecture ‘x86_64’. This should fix the errors.

      • To remove the Nuitrack-interface, delete KinectScripts/Interfaces/NuitrackInterface.cs, the KinectScripts/Interfaces/Nuitrack-folder and comment out the references to NuitrackInterface in KinectScripts/KinectInterop.cs.

      • I tried this

        To remove the Nuitrack-interface, delete KinectScripts/Interfaces/NuitrackInterface.cs, the KinectScripts/Interfaces/Nuitrack-folder and comment out the references to NuitrackInterface in KinectScripts/KinectInterop.cs.

        No errors, but the demo scenes are not working.
        Please note that I am working with Kinect v2.

      • There is something generally wrong in your setup. I don’t remember anyone else so far having similar problems with Kinect-v2 on Windows standalone platform. I suppose you have installed the Kinect SDK 2.0, as per setup instructions.

        In this regard, may I ask you to e-mail me the Unity log file after running and stopping the scene, so I can see what happens. Here is where to find it: Please also mention your invoice number in the e-mail.

      • I managed to fix it by copying the dll and pasting it in the kinectdemoscene.
        It works well


  3. Hi, is it possible to use an ordinary camera (webcam) for motion detection?

    Another question: in your scene KinectFaceTrackingDemo3, is it possible to get exact data of the face only? I mean, we just need to get the background-removed face and neck, and we want to remove the quad boundary seen on the background removal.

    • The motion detection is done by segmenting users from the depth camera image, as far as I remember. If you want to use an external color camera along with it, you would need to calibrate the depth camera and the new color camera, to get correct color image overlays.

      To the 2nd question: See the source of the SetFaceTexture-component in the scene. It extracts the face texture and applies it to the given target object (in this case – the Quad). You can modify the code, to apply the texture to an object of your choice. Hope I understood your issues correctly.

      • Thanks Rumen! As of now I am trying to make the SetFaceTexture quad component work with other rigged models (a combination of avateering + SetFaceTexture for the face, but I only want to expose the head). But I can’t figure out the SetFaceTexture error saying:

        Texture has out of range width / height
        UnityEngine.Texture2D:Resize(Int32, Int32)
        SetFaceTexture:Update() (at Assets/K2Examples/KinectDemos/FaceTrackingDemo/Scripts/SetFaceTexture.cs:166)

        I tried to copy the properties of the quad from the original KinectFaceTrackingDemo3 demo, but it seems it doesn’t work. I made the quad (FaceView, based on your example) a child of the neck portion of the rigged model. Can you tell me about the cause of the error?

        PS: Didn’t modify anything in the SetFaceTexture script.


      • The face texture uses the face part (determined by face-width & face-height parameters, around the head) of the color image or the background-removed image (called foreground texture). That’s why the BackgroundRemovalManager-component is in the scene, too. If you look above line 166 in the script, there are some checks if the face rectangle is not beyond the limits of the foreground or color-camera textures (they have equal resolutions). Maybe this check estimates negative values for width or height of the face texture sometimes. It’s easy to debug and find out, I think. There is nothing special about this script, really 😉
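        As a sketch of the kind of bounds check meant above (the names here are illustrative, not the actual fields of SetFaceTexture.cs):

```csharp
using UnityEngine;

// Hypothetical helper: clamp a face rectangle to the color/foreground
// texture bounds, so the width/height later passed to Texture2D.Resize()
// can never become negative or out of range.
public static class FaceRectUtil
{
    public static Rect ClampToTexture(Rect faceRect, int texWidth, int texHeight)
    {
        float x = Mathf.Clamp(faceRect.x, 0f, texWidth);
        float y = Mathf.Clamp(faceRect.y, 0f, texHeight);
        float w = Mathf.Clamp(faceRect.width, 0f, texWidth - x);
        float h = Mathf.Clamp(faceRect.height, 0f, texHeight - y);
        return new Rect(x, y, w, h);
    }
}
```

        Resizing only when the clamped width and height are positive would avoid the ‘Texture has out of range width / height’ error.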

  4. Hi,
    I am using an “Orbbec Astra Pro” and “NUI Track” now,
    but color is not displayed in the “KinectOverlayDemo 1” scene.

    Can you tell me how to use the color stream with “Kinect v2 Examples with MS-SDK and Nuitrack SDK” on the “Orbbec Astra Pro”?

    This sentence uses Google Translate.

    thank you.

    • This is one of the fixes in the upcoming v2.17.2 of the K2-asset. I submitted it yesterday to the Unity asset store, but it is not yet published. If you want to get it earlier, please e-mail me your request, along with the invoice number you got from the Asset store.

      • Hi,
        with “Kinect v2 Examples with MS-SDK and Nuitrack SDK”,
        in the current version 2.17.2,

        I confirmed that “Orbbec Astra Pro” and “NUI Track” work properly with the “KinectOverlayDemo 1” scene.

        This sentence is using Google translation.

        It saved me.

        Thank you

  5. Hello Rumen,

    I built and ran the projects and, to my folly, I tried to see if the Orbbec Astra would still work (detect skeletons) if several of those apps ran in the background. The demos didn’t crash, but they didn’t work either, and now, even when I reset everything, the whole thing has stopped working. I’m not using the trial version of Nuitrack, either. Any thoughts on how I can fix this..?

  6. Hello! The package is absolutely fantastic. So much stuff for $20. My goal is to track a person that is lying down on his back. It doesn’t work, I guess because the depth difference between the body and the table the person is lying on is too small. Do you have any idea which settings I should tune so that it can work, if possible at all? Thanks for everything!


    • The Kinect’s face-tracking subsystem tracks the positions of the eyes. See the KinectFaceTrackingDemo5-scene in the KinectDemos/FaceTrackingDemo-folder. But if by eye-tracking you mean tracking the direction the user is looking, this is not supported. In this case, better use the forward direction of the head.

  7. Hi Rumen, I use your Asset with a Kinect v2 atm. Apart from installing the SDKs and following your tips, do I have to make any changes to my Unity Project to get it to work with an Orbbec device? Cheers.

  8. Hi Rumen, is it possible to create a fake body that can be detected by the Kinect as a user? Or any idea how I can accomplish this? I will use this for testing purposes. Thanks.

    • Hi, for testing purposes you can make recordings with the Kinect Studio (part of Kinect SDK 2.0) and then play them back, after you run the scene.

  9. Hello! I have used this asset to make a body tracking system.
    I changed my Kinect V2 to an Intel RealSense D415 (with Nuitrack).
    And I played the KinectAvatarsDemo1 scene included in this asset.
    Of course it doesn’t work. The demo scene used the Kinect library and abandoned the others.
    So I removed the Kinect library from my computer.
    But the demo couldn’t find the libnuitrack dll file, and DummyK2Interface is linked,
    although I’ve never installed the respective library.

    How can I play the demo scene with Nuitrack? I’ve already installed the Nuitrack SDK.

  10. Hi, I’m working with a RealSense D435 and the Kinect v2 package. I changed “sensorAlwaysAvailable” from “true” to “false”. When I run the AvatarDemo scene, I see the message “No sensor found. Make sure you have installed the SDK and the sensor is connected.” But I have installed the RealSense SDK. Please help me. Thanks…

  11. Hello Rumen, when moving from one scene to another it gets slow, and the cursor or hand has difficulty grasping objects. I would like to know if you can recommend something to improve this and make my game faster.

    • See ‘Howto-Use-KinectManager-Across-Multiple-Scenes.pdf’ located in K2Example/_Readme-folder in the K2-asset, as well as its implementation in KinectDemos/MultiSceneDemo. To run this demo, you need to put the scenes (in the same order) into the Scenes-in-Build list of Build settings.

      I suppose you have KinectManager-components in each scene, but there should be only one – in the startup scene. See the demo.
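      A minimal sketch of such a guard (illustrative; it assumes the KinectManager singleton from the startup scene persists across scene loads, as described in the pdf above):

```csharp
using UnityEngine;

// Illustrative guard: keep only the KinectManager from the startup scene.
// Attach to the KinectController object of every non-startup scene.
public class DestroyDuplicateKinectManager : MonoBehaviour
{
    void Awake()
    {
        // KinectManager.Instance survives scene loads when created in the
        // startup scene; a second copy in this scene would conflict with it.
        if (KinectManager.Instance != null && KinectManager.Instance.gameObject != gameObject)
        {
            Destroy(gameObject);
        }
    }
}
```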

  12. Hello Rumen, we are using your Kinect v2 plugin for Unity and we face a problem: after attaching the avatar controller component to our own model, when testing with the Kinect, the shoulders drop and deform the mesh. We tried different rig variations. But so far the only solution we got was to use the avatar controller classic; however, when using it there are jitters with model parts like the feet and arms. Is there anything you can recommend us to do?

    • I suppose you mean the 1st fitting-room demo, right? If this is the case, first enable the BackgroundRemovalManager-component of KinectController in the scene, and then simply update the texture of the RawTexture-component of BackgroundImage1 from your script, as needed.

  13. Hi Rumen, my Orbbec Astra Pro works well in Unity when targeting the PC platform,
    but doesn’t when targeting Android. Do you have any idea?
    I have to develop for Android, because the client required me to make it possible to run the project on an Orbbec Persee.

    • Hi, I don’t have experience with Persee, so can’t help you much with it. As to running on Android, I think you need to use the Nuitrack SDK. As far as I remember Nuitrack supported the Android platform, and the K2-asset has Nuitrack interface, as well.

      • I already know about the Nuitrack SDK, and your asset runs successfully on Android OS.
        I just wonder if it is possible to run it in the Unity Editor with the platform switched to Android.
        As you already know, we can’t use the Nuitrack SDK there, because the PC uses Windows.
        But this issue is not about your asset, but about the development environment.
        I’ll try to solve it by myself. Thanks for your advice.

  14. Hi Rumen
    I’m a teacher at a university and I’m trying to use a Kinect v2 in Unity for an educational purpose. I’m reading your post and it looks very interesting. I’m looking for free resources; maybe you can help.

    thank you in advance

  15. Hi Rumen!

    I have to develop a depth-check and object-rendering program.
    First, I find the points where walls, stones or other obstacles are placed, using only the depth value.
    Second, in the color frame, I render some color or object at the points where the stones or obstacles are placed.

    Do you have any recommendation for me?
    any algorithm, tutorial in your assets, etc.

    • Additionally, I have to render silhouettes of the tracked users and
      circles at the points where the users’ hands and feet are placed.
      If you have an idea, please give me advice.

      • Finally, NuitrackInterface logs ‘Depth sensor: nuitrack.NativeDepthSensor, width: 640, height: 480’.
        But the depth stream resolution of the RealSense D415 is 1280×720.
        How can I get a depthImageBufferData of 1280×720?
        The same goes for the ColorFrame: the RGB stream resolution of the RealSense D415 is 1920×1080,
        but NuitrackInterface gives me a 640×480 colorImageBufferData.

      • As far as I remember, the depth and color camera resolutions need to be set in the RealSense-section of ‘\data\nuitrack.config’.

    • I’m not sure about the details of your setup, but I suppose the “normal” situation is not to have any obstacles in the view, in the sense of walls and stones. In this regard, you could check for depth points within some range (left-right and up-down to the camera), and if there are points with depth values less than (let’s say) 4 meters, consider them as obstacles and render them accordingly.

      One other option would be to use some object detection algorithm (for instance deep learning or OpenCV object detection) and use it to look for objects.
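      The first option could be sketched like this (an illustrative component; it assumes KinectManager.GetRawDepthMap() returns the raw depth values in millimeters):

```csharp
using UnityEngine;

// Illustrative sketch: flag depth points closer than a threshold as obstacles.
public class ObstacleScan : MonoBehaviour
{
    public float maxDistanceM = 4f;  // points nearer than this count as obstacles

    void Update()
    {
        KinectManager km = KinectManager.Instance;
        if (km == null || !km.IsInitialized())
            return;

        ushort[] depthMap = km.GetRawDepthMap();  // depth values in millimeters (assumed)
        if (depthMap == null)
            return;

        ushort threshold = (ushort)(maxDistanceM * 1000f);
        int obstaclePoints = 0;

        for (int i = 0; i < depthMap.Length; i++)
        {
            // 0 means 'no reading' and should be skipped
            if (depthMap[i] > 0 && depthMap[i] < threshold)
                obstaclePoints++;
        }

        Debug.Log("Obstacle points this frame: " + obstaclePoints);
    }
}
```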

      • Thanks for your reply. I’ll try the first of your solutions.
        Your assets, knowledge and favor help me a lot.

  16. Hi Rumen,
    I am a student from XJTLU (Xi’an Jiaotong-Liverpool University). I am working on a project exploring the suitability of exergames for virtual reality head-mounted displays. I noticed that your work will be of great help to my research. Would you like to give me a K2 to let me complete my research? (I promise not to use it for any commercial purpose.)
    Zeying Zhang

  17. Hi Rumen,
    Great asset, always a pleasure to use!
    I’ve moved over to the RealSense 415; thanks to your tip, I have gotten it working in Unity with your package.
    My inquiry is about background removal / RGB alignment. As you mentioned, they do not currently work with RealSense. Is there any advice you can give to implement my own version, or fix the existing scenes?

    • Hi, the background removal should work with the RealSense 415, as well. It is not as good as with Kinect-v2, but it should work. One option to explore is to set ‘Depth2ColorRegistration’ in /data/nuitrack.config to ‘true’ and then try again. Please also upgrade to the latest version of the K2-asset, if you are using an older release.

  18. Hi Rumen!
    I followed your previous advice, using an OpenCV object detection algorithm,
    but OpenCV object detection hugely depends on light and shadow,
    so I couldn’t get a good enough result.

    So I’m trying to get the depth map and the farthest depth value (which means the wall),
    and I’ll regard all points whose value is less than the wall’s value as obstacles.

    As a first step, I want to render the depth map to a RawImage texture,
    but in the KinectManager class, while I can render the color texture with the GetUserClrTex() method,
    I can’t find one for the depth texture. Is there any method for the depth texture?

    • And can I get the real distance between the camera and each point?
      For comparison, I would rather use the real distance than the respective depth value.

    • Hi, I meant you need to use the depth image and process it with some OpenCV algorithm, to find the clusters of depth points and their respective centers. But your simplified approach sounds reasonable, as well.

      Regarding your question: First set the ‘Compute user map’-setting of KinectManager to ‘User texture’. Then open DepthShader.shader in K2Examples/Resources-folder and uncomment the commented out code (the else-part) near the end of the shader. This will allow the shader to render the full depth picture, not only the detected users. Finally, you can get the currently rendered depth texture by calling KinectManager.Instance.GetUsersLblTex() or GetUsersLblTex2D(), if you need a Texture2D-texture.
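        Once those steps are done, showing the depth texture on a UI RawImage could look like this (a sketch; assign the RawImage reference yourself):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: display the depth texture (rendered by KinectManager with
// 'Compute user map' set to 'User texture') on a UI RawImage.
public class DepthImageView : MonoBehaviour
{
    public RawImage depthImage;  // assign in the Inspector

    void Update()
    {
        KinectManager km = KinectManager.Instance;
        if (km != null && km.IsInitialized() && depthImage != null)
        {
            depthImage.texture = km.GetUsersLblTex();
        }
    }
}
```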

      • Thanks for your reply 🙂
        I have one last question.
        I’m really sorry to bother you.

        Is it right that the GetRawDepthMap() method in the KinectManager class returns
        the real distance between the sensor and each point, in a ushort array?

  19. Hi, we bought your asset and used it for some projects, but now I just wanted to know if it’s possible to use your package with multiple sensors, like two Kinects or two Orbbecs at the same time?

  20. Hi Rumen!

    We are having some trouble with an Orbbec Astra Pro. With a Kinect, we can successfully track 6 users, but with the Astra and Nuitrack, it seems we are limited to 2 users! Same project, same code.

    Did you already encounter that kind of bug?

  21. Kinect sdk v2.0.1409
    Unity 2018.3.0f2

    I just installed the package from the asset store, but it gives me errors like:
    ‘KinectVisualGestureBuilderSpecialCases.cs’ The name ‘_pNative’ does not exist
    ‘VisualGestureBuilderDatabase’ does not contain a constructor that takes 1 arguments
    ‘VisualGestureBuilderDatabase’ does not contain a definition for ‘AvailableGestures’
    ‘VisualGestureBuilderFrameSource’ does not contain a definition for ‘AddGesture’
    ‘VisualGestureBuilderFrameSource’ does not contain a definition for ‘Dispose’
    ‘VisualGestureBuilderFrameSource’ does not contain a definition for ‘TrackingId’
    … and so on. About 15 errors.
    I tried to add Kinect.2.0.1410.19000.unitypackage and Kinect.VisualGestureBuilder.2.0.1410.19000.unitypackage, but it didn’t help.

    • I also imported the K2-asset into a new Unity 2018.3.0f2 project and it didn’t produce any compilation errors. Please make sure you import the K2-asset into newly created Unity project, to avoid collisions with other, previously imported packages.

      • Sorry for bothering you; probably my Unity setup somehow broke along the way. I’m getting weird errors in all my projects now. I’ll format my computer and try again when I get some free time.

      • If you have problems downloading the K2-asset from Unity asset store, please e-mail me your invoice number and I’ll send you the latest package via WeTransfer.

  23. Hi, Rumen
    I am using the K2-asset very successfully.
    I heard that Kinect v4 was released.
    Do you have a plan to update the K2-asset for Kinect v4 in the future? 🙂

  24. Hello Rumen,

    In the avatar controller script, if posRelativeToCamera is set to a camera, it only works if the camera FOV is 60, while in reality the Kinect v2 has a 53.8° vertical FOV. So if I set the camera FOV to that, to better match reality, the character doesn’t appear in its right location anymore.

    is there a solution to this?

    • Hi, posRelativeToCamera set to ‘MainCamera’ means that the avatar position is estimated relative to the position of the MainCamera, in world coordinates. You can see this in the code of AvatarController.cs – line 965 and below. I’m not sure what the world coordinates have to do with the field of view of the camera, apart from some optical illusion. Or maybe I didn’t understand your issue correctly.

  25. Hi, I just bought your asset. I am a student and didn’t know I could get it free for academic use. The demos have been very informative. I am currently trying to walk in a scene with the Kinect controlling my POV, with rotation and movement. How do I do this?

    • Hi, you can e-mail me your invoice and some proof you are a student (for instance a picture of your student’s card), and I’ll refund your money. Please mind that you will no longer be eligible for e-mail support after that.

      To your question: Look at the demo scenes in KinectDemos/AvatarsDemo-folder (Hope I understood your question correctly). Here is a link to the documentation of the K2-asset scenes and components, as well:

  26. Hi Rumen,
    I’ve recently purchased this asset and everything’s working great so far with the Kinect! The only odd thing I’ve noticed is that when using Nuitrack, OpenNI and the Orbbec Astra, the Avatar’s feet are flailing uncontrollably. The legs etc. are tracked just fine, but the feet rotate for seemingly no reason. When using a Kinect v2 the feet are steady and everything works fine. Do you know how to deal with this?

    • Hi, I’m out of office this week, so please let me check this issue next week. As far as I remember, both Orbbec and Nuitrack don’t track the feet end points. I think this may be causing the issue, but let’s see. Please send me an e-mail next week to remind me about it, in case I forget.

    • I just tested the avatar demo with a RealSense D415 sensor and Nuitrack interface. The avatar’s feet look quite OK, as far as I see. Maybe I’m missing something. Could you please e-mail me some information on what to do, to reproduce the feet-issue. If possible, please include in the e-mail a short video or several screenshots depicting the problem, as well.

  27. Hi Rumen,
    I got my Orbbec Astra Pro working quite nice with Nuitrack and your great asset. However the RGB image shown in any demo is in 640×480 resolution when it should be 1280×720 according to the camera spec (it works as a webcam too and I can confirm that the webcam resolution is 1280×720). I read your related reply for Intel Realsense about the RGB resolution setting in nuitrack.config file, but could not find the similar setting for Astra Pro. Could you guide me a little on this? Thank you very much.

    • I can’t test this right now, but this is what I see in the code: please open KinectScripts/Interfaces/NuitrackInterface.cs and look for ‘colorWebCam = new WebCamTexture(WebCamTexture.devices[i].name, 640, 480, 30);’. Then try to change the resolution to 1280 x 720 instead. I don’t remember the frame rate, but I think it was 15. I’m not sure if this workaround will work, but it is definitely worth trying. Experiment a bit, if needed.
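      For example, the changed line might look like this (1280x720 at 15 fps is the combination to try, as suggested above):

```csharp
// In KinectScripts/Interfaces/NuitrackInterface.cs - original line:
// colorWebCam = new WebCamTexture(WebCamTexture.devices[i].name, 640, 480, 30);
// Changed to request 1280x720; the 15-fps frame rate is a guess, as noted above:
colorWebCam = new WebCamTexture(WebCamTexture.devices[i].name, 1280, 720, 15);
```

      Note that WebCamTexture treats the requested resolution and frame rate as a request; the device may still fall back to the nearest mode it supports.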

  28. Hi Rumen, we are trying to create a debug mode for our Kinect app.

    Is there any way we can create a dummy skeleton for the joints, to do a virtual dressing room test fitting the 3D model clothing?

    • There is a disabled SkeletonOverlayer-component of the KinectController-game object in the KinectFittingRoom2-scene. You can enable it, to see how the Kinect-detected skeleton matches the model. You could also attach some primitives (spheres, cubes) to the joints of the model, to make the joint match even more visible.
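      Attaching primitives to the model’s joints could be sketched like this (illustrative; assign the bone transforms of your model in the Inspector):

```csharp
using UnityEngine;

// Illustrative sketch: parent small spheres to selected model bones,
// to make the joint match visible at runtime.
public class JointDebugMarkers : MonoBehaviour
{
    public Transform[] bones;       // the model's bone transforms, set in the Inspector
    public float markerSize = 0.05f;

    void Start()
    {
        foreach (Transform bone in bones)
        {
            GameObject marker = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            Destroy(marker.GetComponent<Collider>());  // markers are visual only
            marker.transform.SetParent(bone, false);
            marker.transform.localScale = Vector3.one * markerSize;
        }
    }
}
```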

  29. Hi Rumen,
    It’s an amazing asset, thank you for your work. And the Azure Kinect DK of Microsoft will be on sale soon. Do you plan to develop a new Unity asset for the Azure Kinect DK?

  30. Hi Rumen,
    I am switching platform from WinPC to Android, using Nuitrack as a middleware.

    I found that the Nuitrack middleware works well with your asset on PC. Though when I switch the OS to Android, it fails. I only get the dummy SDK (when checking GetSensorPlatform()); Nuitrack is not accessed.

    Is there any way I can force the compiled apk to access Nuitrack as middleware, or is there some other recommendation you would like me to check?

    Hope this description is clear for your understanding, and many thanks for your effort and passion.

    I have successfully compiled the official Nuitrack asset from PC to Android, and it works well.
    Though I would prefer to use your asset as the first priority to transfer our project to Android OS.

    • Hi, please open KinectScripts/KinectInterop.cs and look at the SensorInterfaceOrder near its beginning. ‘NuitrackInterface()’ should be there, enabled for the current platform (Android). If it is enabled, look at the Unity logs on Android after you run the scene, to see if there are any errors when NuitrackInit() or the Create()-methods get called.

  31. Hi Rumen. Congrats. I bet you regret you made this, considering the amount of questions:)

    I am building an installation that pulls animations from a motion capture database. In the same project, I have your package (with a copy of the AvatarDemo) and Keijiro’s Smrvfx (which at the moment works like a charm) applied to our U_Char. The idea is that the user dances in front of the sensor, rendered with a VFX graph, together with two other avatars with different pre-recorded animations applied to them.

    I would like to record the motion-capture data from the user, to then replace the side avatars’ animations with it. Is there a way to record animations directly with your asset? I tried to implement Cinema Mocap, but some folders don’t get along with each other.

    I hope it makes sense!! Please let me know if you prefer an email.

    • Hi Alberto. Sometimes I regret and sometimes it makes me feel good, to get that many questions 🙂
      Yes, please send me an e-mail and mention your invoice number. I’m not sure I understand the complexity of your setup. Do you need to create and play these animation clips dynamically at runtime?

  32. Hi Rumen, nice to meet you. Thank you for all your efforts in creating this.

    I just bought the example package but it gives me this error
    Assets/K2Examples/KinectDemos/FaceTrackingDemo/Scripts/FacePointOverlayer.cs(5,17): error CS0234: The type or namespace name ‘Kinect’ does not exist in the namespace ‘Microsoft’. Are you missing an assembly reference?

    14 errors in different .cs files

    Is there anything I can do? I already checked the build settings.

    Best Regards!

  33. Hi Rumen, I am sorry to bother you. I imported according to the instructions, and it is giving me fewer errors now.
    Could you please help me? Maybe I can write to your e-mail?

  34. (How to make that integration.) Additionally, how can I integrate the body-data recorder and player with the same character (avatar)?

    Thank you very much.

    • The KinectRecorderPlayer-component records or replays the body data of all tracked users. Copy the KinectRecorderPlayer-component from KinectRecorderDemo-scene to the KinectController-game object in one of the avatar demo-scenes, and enable its ‘Play at start’-setting. If you have a body recording, this should be enough to replay it. If you want to record and replay the recording in the same scene, copy KinectPlayerController and optionally SpeechManager-components, as well, and disable the ‘Play at start’-setting.

  35. Hi Rumen.

    I’ve been working with the RealSense sensor and the FittingRoom1-scene as a base project.

    Everything is really, really fine.

    So, first of all… greetings because of your assets… (I’ve been working with the Azure asset as well, and it’s really beautiful!).

    I’ve got just one problem that I hope you have an easy solution for.

    Let me explain the context.

    If I configure the nuitrack.config with the same resolution for the depth image and the color image, the body “overlay” always fits perfectly.

    For example, if I use:

    SCENARIO #1: small depth and small color image, without rescaling

    "Realsense2Module": {
      "DeviceHardwareReset": false,
      "Depth": {
        "RawWidth": 640,
        "RawHeight": 360,
        "ProcessMaxDepth": 5000,
        "ProcessWidth": 640,
        "ProcessHeight": 360,
        "Preset": 5,
        "PostProcessing": {
          "SpatialFilter": {
            "spatial_iter": 0,
            "spatial_alpha": 0.5,
            "spatial_delta": 20
          },
          "DownsampleFactor": 1
        },
        "LaserPower": 1.0
      },
      "FileRecord": "",
      "RGB": {
        "RawWidth": 640,
        "RawHeight": 360,
        "ProcessWidth": 640,
        "ProcessHeight": 360
      }
    }

    Everything works perfectly!!! The body overlay fits perfectly over the real body, but with a “bad resolution” of the RGB background…

    SCENARIO #2: small depth and big color image, with rescaling of the depth image to match sizes

    "Realsense2Module": {
      "DeviceHardwareReset": false,
      "Depth": {
        "RawWidth": 640,
        "RawHeight": 360,
        "ProcessMaxDepth": 5000,
        "ProcessWidth": 1920,
        "ProcessHeight": 1080,
        "Preset": 5,
        "PostProcessing": {
          "SpatialFilter": {
            "spatial_iter": 0,
            "spatial_alpha": 0.5,
            "spatial_delta": 20
          },
          "DownsampleFactor": 1
        },
        "LaserPower": 1.0
      },
      "FileRecord": "",
      "RGB": {
        "RawWidth": 1920,
        "RawHeight": 1080,
        "ProcessWidth": 1920,
        "ProcessHeight": 1080
      }
    }

    Everything works perfectly!!! The body overlay fits perfectly over the real body, with good image resolution… but with very “bad performance” on the computer (and it’s a very good, new one).

    SCENARIO #3: small depth image and big color image, without rescaling nothing

    "Realsense2Module": {
      "DeviceHardwareReset": false,
      "Depth": {
        "RawWidth": 640,
        "RawHeight": 360,
        "ProcessMaxDepth": 5000,
        "ProcessWidth": 640,
        "ProcessHeight": 360,
        "Preset": 5,
        "PostProcessing": {
          "SpatialFilter": {
            "spatial_iter": 0,
            "spatial_alpha": 0.5,
            "spatial_delta": 20
          },
          "DownsampleFactor": 1
        },
        "LaserPower": 1.0
      },
      "FileRecord": "",
      "RGB": {
        "RawWidth": 1920,
        "RawHeight": 1080,
        "ProcessWidth": 1920,
        "ProcessHeight": 1080
      }
    }

    This scenario seems the best one for me: good performance and good color-image quality. The problem is that the 3D-model overlay does not match the color background: it’s exactly one third to the top, one third to the left, and one third the scale of the color image… matching the ratio of the sizes: 640*3=1920 and 360*3=1080…

    What can I modify, to multiply the position and size of the overlay (or of the camera that shows the overlay) by 3?

    Really, really thanks!!!
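
    For reference, the “one third” mismatch described above follows directly from the ratio between the processed depth resolution and the color resolution. A minimal arithmetic sketch (not code from the asset; any fix would have to apply such a scale factor to the overlay position and size, or to the camera that renders the overlay):

```csharp
using System;

class OverlayScaleSketch
{
    static void Main()
    {
        // Scenario #3: depth processed at 640x360, color at 1920x1080
        int depthW = 640, depthH = 360;
        int colorW = 1920, colorH = 1080;

        float scaleX = (float)colorW / depthW;  // 3.0
        float scaleY = (float)colorH / depthH;  // 3.0

        // A point in depth-image space needs this scaling to land on the
        // matching pixel of the full-size color image:
        float dx = 200f, dy = 120f;  // hypothetical depth-space point
        Console.WriteLine($"scale: {scaleX} x {scaleY}");
        Console.WriteLine($"mapped point: {dx * scaleX}, {dy * scaleY}");
    }
}
```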

  36. Hi Rumen,
    I tried working in Unity 2018.2 and also Unity 2019. Although there are no errors, the Kinect doesn’t work and doesn’t get initialized. How can I solve this problem?

    Thank you, Best regards,

    • Hi Mehmet, if the Kinect-sensor cannot start, there should be some error in the console, I think. Please also make sure the ‘PC, Mac & Linux Standalone’-platform is selected in the ‘Build settings’. If the problem persists, please e-mail me the Editor’s log-file, so I can take a closer look. Here is where to find the Unity log-files:

    • Hi, as far as I see, the compilation error is in a component of the AllSkyFree-package. It has nothing to do with the K2-asset. You could either analyze and fix the issue, or delete the package altogether, if you don’t need it.

  37. What does the warning ‘WWW’ is obsolete: ‘Use UnityWebRequest, a fully featured replacement which is more efficient and has additional features’ mean?

    • This is a Unity deprecation warning, emitted wherever the WWW-class of Unity is used for Internet-based requests. You can safely ignore it, as long as the code compiles.
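
      If you would rather address the warning than ignore it, the replacement typically looks like the sketch below (illustrative only, with a placeholder URL; note that ‘UnityWebRequest.Result’ requires Unity 2020.1 or newer, while older versions use the ‘isNetworkError’/‘isHttpError’ flags instead):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of a typical WWW-to-UnityWebRequest migration for a simple GET request.
public class WebRequestSketch : MonoBehaviour
{
    IEnumerator Start()
    {
        // UnityWebRequest replaces the obsolete WWW class for HTTP requests
        using (UnityWebRequest req = UnityWebRequest.Get("https://example.com/data"))
        {
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
                Debug.Log(req.downloadHandler.text);
            else
                Debug.LogWarning(req.error);
        }
    }
}
```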

  38. Hi,Rumen

    I am a student who wants to make my degree project with Kinect in Unity. I bought your asset on the Unity Asset Store. It ran perfectly in version 2019.2.12, but after I updated Unity to 2019.4.16, the Kinect scripts don’t work in my project. Could you please help me?

    When I add “kinectManager” this script to my maincamera, it shows error:

    “Can’t add script component ‘KinectManager’ because the script class cannot be found. Make sure that there are no compile errors and that the file name and class name match.”


    • Hi, please look for errors in the console after the update. By the way, as a student you don’t need to buy the asset; you can contact me and get it from me, free of charge for academic and personal use.

      • Ohh, thanks a lot, you are so nice. But I have bought it already, haha. I was trying to solve the errors in the console, but there are so many of them, I don’t know if I can solve them :(

      • Try to start with the 1st error. It may give you a good hint of what may be wrong. If you get stuck, please find the Editor’s log-file and e-mail it to me, so I can take a closer look. Here is where to find the Unity log-files:

        The good thing about your purchase is that now you are eligible for support 😉

  39. Hey, I got the message “Kinect is not genuine”. I have no clue what that means; eager to try out the blob scenes ;)

  40. Hi Rumen, due to the COVID-19 situation, where everyone wears a face mask, is there any patch for the FacetrackingManager? I have tested all scenes with the FacetrackingManager while wearing a face mask, but it doesn’t work. I think it’s because it doesn’t detect the user’s nose.
    Do you have a patch in mind to solve this issue?

    Using Kinect-v2, Unity running on Windows 10 64-bit.



    • Hi Oscar, sorry for the delayed response. No, unfortunately the Kinect SDK face tracking model cannot be changed, at least not easily. But, as far as I remember, I’ve seen models for tracking faces with masks that can work with RGB cameras. Please consider researching a bit, and use one of these, if possible.

  41. Hi Rumen

    I have created a project using Kinect v2 Examples with MS-SDK, and I want to run it on the Unity WebGL platform, but I am facing an error. Is there any way to fix it? Thanks

      • A WebGL-build suggests your code will run in a browser. The browser doesn’t allow device control or calling DLL functions out of the box. You need to use the server/client approach, where the client (with KinectDataClient-component, see NetworkAvatarDemo1-scene) should run in the browser, while the KinectDataServer-scene should run on the machine, where the Kinect-sensor is physically connected. This may be the same machine, I think. When the client is in the browser, you need to enable the ‘Websocket host’-setting of the KinectDataServer-component.

  42. Hello Sir,

    Can you please tell me if Unity 2021.2 supports the Kinect-v2 SDK?
    I have tried a lot, but it still does not work.
    Can you help me run the Kinect-v2 SDK in Unity 2021.2?

    Thanks & Regards
    Karan Arya 🙂🙂

      • Hello Rumen Sir,

        I have made a video clip, in which you will see what the problem is.
        Please help solve this problem.
        I have been worried for quite some time.
        Please help me, sir.

        Thanks & Regards
        Karan Arya🙂🙂

      • The reason for these errors is the deprecation of the high-level networking API in Unity. For more info see this link. If Unity 2021.2.0 is not a must, you should be able to load and run it in Unity 2020.3.

  44. Hi Rumen.
    The MocapAnimatorDemo doesn’t write the right-leg animation to the file.
    I couldn’t find the error.
    How can I fix that?

    • Hi Ronaldo, I have received your e-mail and I’m researching the issue. Please have some patience. It’s K2-asset specific. The same scene in the K4A-asset works without issues.

      • Yes!
        I’ve already received your e-mail response.
        The solution worked perfectly.

        Thank you very much!

  45. hello,
    I was wondering if the examples include support for multiple people?
    Two people/bodies in the frame at the same time.
    Thank you!
