Kinect v2 Examples with MS-SDK

‘Kinect v2 Examples with MS-SDK’ v2.24 is a set of Kinect-v2 (aka ‘Kinect for Xbox One’) examples that use several major scripts, grouped in one folder. The package contains over thirty demo scenes.

Please also look at the Azure Kinect Examples for Unity asset. It works with Azure Kinect and Orbbec Femto Bolt & Femto Mega sensors, as well as with Kinect-v2 sensors.

The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demos show how to use programmatic or visual gestures, the fitting-room demos show how to create your own dressing room, the overlay demos show how to overlay body parts with virtual objects, etc. You can find short descriptions of all demo scenes in the K2-asset online documentation.

This package works with Kinect-v2 sensors (aka Kinect for Xbox One) and Kinect-v1 sensors (aka Kinect for Xbox 360). It can be used with all versions of Unity – Free, Plus & Pro.

If you need a package with similar functionality, components and demo scenes that works with a regular camera, please look at the Computer Vision Examples for Unity asset.

One request:
Please don’t share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.

Customer support:
* First, please check if you can find the answer you’re looking for on the Tips, tricks and examples page, as well as on K2-asset Online Documentation page. See also the comments below the articles here, or in the Unity forums.
* If you e-mail me, please include your invoice number. You can find more information regarding e-mail support here.
* Please note that you can always upgrade your K2-asset free of charge from the Unity Asset Store.

How to run the demo scenes:
1. (Kinect-v2) Download and install Kinect for Windows SDK v2.0. The download link is below.
2. (Kinect-v2) If you want to use Kinect speech recognition, download and install the Speech Platform Runtime, as well as EN-US (and other needed) language packs. The download links are below.
3. (Kinect-v1) If you want to work with Kinect-v1 sensor, please download and install Kinect for Windows SDK v1.8. The download link is below. Please also look at this tip.
4. Import this package into a new Unity project.
5. Open ‘File / Build Settings’ and make sure that ‘Windows’ is the currently active platform, and the architecture is set to ‘Intel 64-bit’.
6. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’ list, in ‘Player Settings / Other Settings / Rendering’.
7. Open and run a demo scene of your choice, from a subfolder of the ‘K2Examples/KinectDemos’ folder. Short descriptions of all demo scenes are available here.

* Kinect for Windows SDK v2.0 (Windows-only) can be found here.
* MS Speech Platform Runtime v11 can be downloaded here. Please install both x86 and x64 versions, to be on the safe side.
* Kinect for Windows SDK 2.0 language packs can be downloaded here. The language codes are listed here.
* Kinect for Windows SDK v1.8 (Windows only) can be found here.

Documentation:
* The online documentation of the K2-asset is available here, as well as a PDF file.

Downloads:
* The official release of ‘Kinect v2 with MS-SDK’ is available at the Unity Asset Store. All updates are free of charge.

Troubleshooting:
* If you get errors like ‘Texture2D’ does not contain a definition for ‘LoadImage’ or ‘Texture2D’ does not contain a definition for ‘EncodeToJPG’, please open the Package Manager, select ‘Built-in packages’ and enable the ‘Image conversion’ and ‘Physics 2D’ packages.
* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, you need to set the build platform to ‘Windows Standalone’. For more information, look at this tip.
* If the Unity editor crashes, when you start demo-scenes with face-tracking components, please look at this workaround tip.
* If the demo scene reports errors or remains in the ‘Waiting for users’ state, make sure you have installed Kinect SDK 2.0 and the other needed components, and check that the sensor is connected.

* Here is a link to the project’s Unity forum: http://forum.unity3d.com/threads/kinect-v2-with-ms-sdk.260106/
* Many Kinect-related tips, tricks and examples are available here.
* The official online documentation of the K2-asset is available here.

Known Issues:
* If you get compilation errors, like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, open the project’s Build Settings (menu File / Build Settings) and make sure that ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left. On the right side, ‘Windows’ must be the ‘Target platform’, and ‘x86’ or ‘x86_64’ the ‘Architecture’. If Windows Standalone was not the default platform, make these changes and then click the ‘Switch Platform’-button, to set the new build platform. If the Windows Standalone platform is not installed at all, run UnityDownloadAssistant again, then select and install the ‘Windows Build Support’-component.
* If you experience Unity crashes, when you start the Avatar-demos, Face-tracking-demos or Fitting-room-demos, this is probably due to a known bug in the Kinect face-tracking subsystem. In this case, first try to update the NVidia drivers on your machine to their latest version from NVidia website. If this doesn’t help and the Unity editor still crashes, please disable or remove the FacetrackingManager-component of KinectController-game object. This will provide a quick workaround for many demo-scenes. The Face-tracking component and demos will still not work, of course.
* Unity 5.1.0 and 5.1.1 introduced an issue, which causes some of the shaders in the K2-asset to stop working. They worked fine in 5.0.0 and 5.0.1 though. The issue is best visible, if you run the background-removal-demo. In case you use Unity 5.1.0 or 5.1.1, the scene doesn’t show any users over the background image. The workaround is to update to Unity 5.1.2 or later. The shader issue was fixed there.
* If you update an existing project to K2-asset v2.18 or later, you may get various syntax errors in the console, like this one: “error CS1502: The best overloaded method match for `Microsoft.Kinect.VisualGestureBuilder.VisualGestureBuilderFrameSource.Create(Windows.Kinect.KinectSensor, ulong)’ has some invalid arguments”. This may be caused by the moved ‘Standard Assets’-folder from Assets-folder to Assets/K2Examples-folder, due to the latest Asset store requirements. The workaround is to delete the ‘Assets/Standard Assets’-folder. Be careful though. ‘Assets/Standard Assets’-folder may contain scripts and files from other imported packages, too. In this case, see what files and folders the ‘K2Examples/Standard Assets’-folder contains, and delete only those files and folders from ‘Assets/Standard Assets’, to prevent duplications.
* If you want to release a Windows 32 build of your project, please download this library (for Kinect-v2) and/or this library (for Kinect-v1), and put them into K2Examples/Resources-folder of your project. These 32-bit libraries are stripped out of the latest releases of K2-asset, in order to reduce its package size.

What’s New in Version 2.24:
1. Upgraded the ‘Kinect-v2 Examples with MS-SDK’ package to Unity 6.0.
2. Fixed the incorrect-face-rectangle issue in the ShowFaceImage-component (KinectFaceDemo3).
3. Cosmetic changes in various components and demo scenes.

Videos worth more than 1000 words:
Here is a mixed-reality interactive experience that takes the visitors to beautiful jungles, landscapes, safari and other imaginary worlds, created by ‘YOYO Events’ and ‘AnimatedStudio’:

Here is a mixed-reality interactive experience that takes the visitors to different worlds, different times and dimensions, created by ‘YOYO Events’ and ‘AnimatedStudio’:

Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…a video by Ranek Runthal, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…and a video by Brandon Tay, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.0:

811 thoughts on “Kinect v2 Examples with MS-SDK”

  1. Hi Rumen, I have a question about gestures across multiple scenes.
    I have multiple scenes. During scene 4 (for example), I add a gesture for the primary user by script (KinectManager.DetectGesture). Then, when I move to scene 5, is the added gesture still detected?

    • Hi Renaud, it would be good if you call DetectGesture() when a new user ID gets detected. With multiple scenes, if your KinectManager is created properly – once at startup, and is available across the scenes, I don’t think you need to call it again, but just experiment and you will see. Anyway, calling DetectGesture() multiple times won’t do any harm. If the gesture detection is initialized for that user, the call to DetectGesture will be ignored.

  2. Hi Rumen,

    With your setup, would it be possible to send the Kinect data over a network to a client on another machine instead of on the same machine?

  3. Hi Rumen,

    I was wondering if you’ve had any success overlaying a skeleton on top of a color view. I am trying to do just that, but I am missing something critical, I think.

    I take the color view and display it on an orthographic camera (5.4 size, the cube displaying the view is scaled to 19.2×10.8), and then a perspective camera is overlaying the joint positions on top of that. I create the joints in 3D space at the position the joint is from the Kinect. The Kinect is 1 meter off the ground, and similarly my perspective camera is at 0,1,0.

    The overlay is close, but not quite there, and if I stretch in any direction, the joint position looks especially wrong.

    Do you have any advice?

    Regards,
    Joe

    • Hello Joe,

      I’m not so much of an expert in overlays, but why is the first camera orthographic and the second one perspective? The color view should be only the background of the scene. Also, take a look at the KinectOverlayDemo. Maybe it will help.

      Greetings!

      • Hi Rumen,

        Thanks for the quick response! I made the first camera orthographic in order to make the color view the background. The perspective camera draws on top, so the joints always appear above the color frame.

        I actually got the skeleton to overlay in 2D by transforming the camera space points to color space points, and then creating the joints on the orthographic camera at their position in color space. But it would still be good to get the skeleton working in 3D…

        I don’t see a KinectOverlayDemo in the project. Was this a newer addition to your package?

        Thanks,
        Joe

  4. Hi Rumen,

    How can I get a copy of the package? We need to incorporate Kinect v2 into our Unity projects.

    Regards,
    Eugene

  5. Rumen, I have a question; I'm a beginner at this, so sorry if it is a simple one. What is the difference between Kinect v1 and v2? And if I use v2, can I work with the Kinect for Xbox 360, or do I have to use the Kinect for Windows v2?

      • uhm, I am looking for the SDK for the Kinect for Xbox. Can you send me a link to a tutorial on how to make that work in Unity? Thanks for all the help. I'm using these for a project that I am making in Unity.

      • oh, I almost forgot: I already have an Xbox Kinect, but I was thinking of buying the Kinect for Windows v2. Would that make development better and easier for me, and would the difference between the two become noticeable in the game I am developing? Or is the Kinect for Xbox sufficient for a game that is only a university project?

      • Rumen, I need some help making my own movement recognized by the Kinect. Do you know any source that gives tutorials on how to do that? I have tried to Google it, but all I found are the basics of Kinect programming. Can you help me?

      • no, I mean I don't know how to make my game react to certain movements by the player. For example, when the player stretches his left arm slightly upward, the game should pause. How do I do that, i.e. recognize the movement and make it act like a button on a controller? 🙂

      • You need a gesture listener, like the GestureListener-script in KinectGesturesDemo-scene. There is also a short tutorial ‘Howto-Use-Gestures-or-Create-Your-Own-Ones.pdf’ in the Assets-folder. As to the pause, there is a Stop-gesture, but it seems unfinished. If you really need this gesture, please contact me via e-mail and I’ll try to provide more reliable code.

      • Hello again, Rumen 😀 I am studying the pdf right now and trying to understand it; sorry for being such a beginner. I do have another question: I have watched some gameplay videos of Kinect games on YouTube, and I really liked the cursor being used, the one with the hand and a circle around it. Once the cursor hovers over a button, the circle fills up white, and when it finishes, a click event occurs. Does the Microsoft SDK have that cursor? If not, where can I get one? I will use it in the menu of my game.

      • Hey Franz, I know the cursor you mean, from the Xbox games. Unfortunately, I'm far from a designer, and you can see for yourself that the cursors in my demos don't look nice at all. The progress circle is also missing. If you find such a cursor, or a similar-looking one, for Unity, please send it over to me too 😉

      • hey rumen 😀 I have found an article on the Coding4Fun website, and I think it is what we are looking for, based on what I have read in it. My question is: they are using the Coding4Fun Kinect toolkit. Would you be able to apply this in yours? If so, that would be great! 😀 I have no idea how to get this to work with the Microsoft SDK, so if you can help me, that would also be great 😀

      • have you seen my post, Rumen? I need help making this work with the Unity plugin for Kinect 😀

      • Hi Franz, yes I’ve seen your posts and have them on my to-do list, but there is no time at the moment to deal with animated cursors. Unfortunately, there are more urgent issues to solve. How can I help you to make them work?

      • I'm sorry for disturbing you, Rumen. I think the links I have given you are in WPF? How can I use them in Unity?

      • I’ve researched today the cursor stuff a bit. I think the wpf-cursors are not directly applicable in Unity, but you could use a similar approach. The cursor is a Texture2D, no matter if you would use a GUITexture or Sprite. The progress of the Click gesture is known at any time (between 0f and 1f, which means 0%-100%). There is also information in Unity-answers how to draw a circle on a Texture2D and how to use a dynamic texture in SpriteRenderer (or on GuiTexture). All you need to do is research it a bit, think it over and implement it in KinectManager or into your own gesture listener. 🙂
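        The idea above could be sketched like this, using only standard Unity Texture2D calls. The class name, the ring sizes and the way the 0f..1f progress value reaches the component are all assumptions; the progress is expected to come from your own gesture listener:

        ```csharp
        using UnityEngine;

        // Hypothetical helper: repaints a radial "click progress" ring into a
        // Texture2D, which can be shown over the hand-cursor texture
        // (as a sprite or GUI texture).
        public class ProgressRing : MonoBehaviour
        {
            public int size = 64;       // texture width/height in pixels (assumed)
            public Texture2D ringTex;

            void Start()
            {
                ringTex = new Texture2D(size, size, TextureFormat.RGBA32, false);
            }

            // progress: 0f..1f = 0%..100% of the click gesture
            public void UpdateRing(float progress)
            {
                float maxAngle = progress * 2f * Mathf.PI;
                Vector2 center = new Vector2(size / 2f, size / 2f);

                for (int y = 0; y < size; y++)
                {
                    for (int x = 0; x < size; x++)
                    {
                        Vector2 d = new Vector2(x, y) - center;
                        float r = d.magnitude;

                        // clockwise angle from the top of the circle, 0..2*PI
                        float angle = Mathf.Atan2(d.x, d.y);
                        if (angle < 0f) angle += 2f * Mathf.PI;

                        // paint white on the ring up to the current progress angle
                        bool onRing = r >= size * 0.4f && r <= size * 0.5f;
                        Color c = (onRing && angle <= maxAngle) ? Color.white : Color.clear;
                        ringTex.SetPixel(x, y, c);
                    }
                }
                ringTex.Apply();  // upload the changed pixels to the GPU
            }
        }
        ```

        The resulting ringTex can then be assigned to a SpriteRenderer or drawn in OnGUI() at the hand-cursor position.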

      • hi rumen, a quick question: is it possible to create a dynamic gesture, like running, or jumping while the hands are up, with the Xbox 360 Kinect and MS SDK v1? If so, do you know of any tutorials for that? I only seem to find simple gestures on the net, like hand-above-the-head gestures.

      • Hi Franz, well, there is no gesture-recognition functionality in SDK-v1, as far as I know. This means you need to implement the gesture recognition by yourself; do it in any way you like and detect any kind of gestures. If you can write a gesture-recognition function that recognizes running or jumping, you're free to do it 🙂 By the way, I've implemented recognition of jumps and squats in the “Kinect with MS-SDK” and “Kinect v2 with MS-SDK” packages.

      • I am using your Kinect with MS-SDK, so the jump gesture will not be a problem anymore; I just have to do the running gesture now 🙂 thanks!

      • this is from the jump gesture:
        Jump – the hip center gets at least 10cm above its last position within 1.5 seconds.

        so if I want a running gesture, will I have to use code that gets the knees' center positions, say, 20cm up within 1.5s too? But the knees need to be alternating: up 15cm and back to the original position. Is it possible to recognize going 15cm up and down, and to recognize both knees doing that at alternating times?

      • The gesture recognition process is a state machine. Try to describe your “run gesture” as set of states. For instance, the distance between the knees in Y-direction to be at least XX cm (state 0) would suggest that maybe the user starts running. Then if the knees change places, i.e. the previously lower one should become higher and vice versa, while making the Y-distance between them again at least XX cm within Y.Y seconds (state 1). Then they should change places again (repeat state 1). To make sure this will work, try to record time and positions of the both knees in a file while you run. Analyze this data and you will find out what the initial values XX and Y.Y should be. After that you can fine tune them in the process of implementing and testing your gesture.
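        The state machine described above could be sketched like this, in the spirit of the jump-gesture code. The class name, the thresholds (XX = 0.15 m, Y.Y = 1.0 s) and the way knee positions are obtained are assumptions, to be tuned with the recorded data:

        ```csharp
        // Hypothetical run-gesture detector: tracks which knee is higher
        // and counts timely alternations.
        public class RunGestureDetector
        {
            const float MinKneeDeltaY = 0.15f;  // XX: min Y-distance between the knees (m)
            const float MaxStepTime = 1.0f;     // Y.Y: max time between knee swaps (s)

            int state = 0;                      // 0 = waiting, then one step per swap
            bool leftKneeWasHigher;
            float lastStateTime;

            // Feed in the knee heights (in meters) and current time every frame.
            // Returns true once two alternations have been seen in time.
            public bool Check(float leftKneeY, float rightKneeY, float time)
            {
                float deltaY = leftKneeY - rightKneeY;
                if (System.Math.Abs(deltaY) < MinKneeDeltaY)
                    return false;               // knees not clearly apart; keep waiting

                bool leftHigher = deltaY > 0f;

                if (state == 0)
                {
                    // state 0: one knee is clearly higher; remember which one and when
                    leftKneeWasHigher = leftHigher;
                    lastStateTime = time;
                    state = 1;
                }
                else if (leftHigher != leftKneeWasHigher)
                {
                    if (time - lastStateTime > MaxStepTime)
                    {
                        state = 0;              // too slow; start over
                        return false;
                    }

                    // state 1 (repeated): the knees changed places in time
                    leftKneeWasHigher = leftHigher;
                    lastStateTime = time;

                    if (++state >= 3)           // two swaps seen: looks like running
                    {
                        state = 0;
                        return true;
                    }
                }
                return false;
            }
        }
        ```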

      • rumen, what will I need to copy from the avatars demo to my own scene to be able to use just the cursor in my own scene? I am working on my start menu and I only need the cursor to follow my right or left hand. I already copied the HandCursor, the KinectManager and the SimpleGestureListener and placed them on my main camera, but the hand cursor does not follow my hand during play.

      • You need to copy the Assets/KinectScripts-folder and Assets/Resources-folder to your project. Then add KinectManager and InteractionManager to the MainCamera in the scene and set up the cursor textures (settings of the InteractionManager). You don’t need gesture listeners to control the cursor.

      • I have gotten it to work now 🙂 thank you. Another question: does the click gesture also work as a mouse click, or do I have to edit it to make it work as a mouse click on a button?

      • Also, Rumen, the HandCursor seems to be moving under other game objects, such as buttons and sliders. Is that the reason it's not able to click the button?

      • i think the problem is that the GUI button is being created last and is on top of the cursor. I don't see any way to make it the other way around, so the cursor is created last and is on top of the GUI.

      • I have been able to make it work now, but the Hand Cursor game object does not do a mouse click. I can only trigger the mouse-click event when I check the ‘Control Mouse Cursor’ box on the KinectManager, but the HandCursor game object doesn't trigger the mouse-click event.

      • Yes, you need to enable ‘Control Mouse Cursor’ in order to make the Click gesture interact with the UI in the standard (mouse) way. In this case, it would be better to disable the hand-cursor GUI texture or sprite. You're also right that the cursor goes below the other GUI. I would suggest two possible workarounds: 1. put all GUI code (including cursor rendering) in one OnGUI()-function, where the hand cursor is rendered last, or 2. change the script execution order, so GUI rendering comes before the InteractionManager that renders the cursor.

      • Sorry for the delayed responses, but there are holidays now. Merry Christmas, if you celebrate it. You can send me e-mails, if you need (a bit) faster responses.

      • rumen, where do I put the code that will make my model character move forward when the gesture is detected? I can't seem to find the right place. 🙁

      • You can either check in your script whether a gesture is detected, by using the ‘kinectManager.IsGestureComplete()’-function, or via gesture listeners – look at the GestureListener in the GesturesDemo-example or the SimpleGestureListener in the AvatarsDemo.
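        The polling variant might look like this in a script's Update(); the chosen gesture and the exact parameters are assumptions, so check the IsGestureComplete() declaration in the KinectManager script of your package version:

        ```csharp
        using UnityEngine;

        // Sketch: poll the KinectManager every frame for a completed gesture.
        public class GesturePoller : MonoBehaviour
        {
            void Update()
            {
                KinectManager manager = KinectManager.Instance;
                if (manager == null || !manager.IsUserDetected())
                    return;

                long userId = manager.GetPrimaryUserID();

                // last argument: reset the gesture state after detection,
                // so the same gesture can fire again later
                if (manager.IsGestureComplete(userId, KinectGestures.Gestures.SwipeLeft, true))
                {
                    // react here: move the character, switch the scene, etc.
                }
            }
        }
        ```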

      • Merry Christmas to you too, Rumen, and a Happy New Year 🙂 Question: in the SimpleGestureListener, do I not need to add anything in the GestureProgress function? And do I just need to add an else if (gesture == “my gesture”) to create an action when the gesture has been recognized?

      • I am now able to make my character move using the SimpleGestureListener, but I have a problem: instead of sliding into the next position, my character blinks into it whenever the gesture is detected. Do you have any idea how I can make the character look like it's moving/running, rather than blinking into the next position?

      • rumen, does the KinectManager block rotation from scripts? I have tried rotating my character using transform.Rotate and a rigidbody, but during play mode, when the gesture is detected, the only thing that rotates is the camera, and it rotates around the character. What I want is to have the character rotate together with the camera.

      • Nope, the KinectManager doesn’t block anything. The full source code is there, so you can check this by yourself. Probably there is an AvatarController attached to your character that rotates the body (and its joints) concurrently to your script. If this is the case, just remove this component or fine tune it by commenting out the joints you don’t need to be controlled by your movements. I.e. comment out the unneeded rows in boneIndex2JointMap and boneIndex2MirrorJointMap hash-maps, for instance ‘{0, KinectInterop.JointType.SpineBase},’.

    • I think I found what's causing my problem. I wrote the code in the GestureCompleted function, not in the Update function, which is called every frame. Is that what's causing the blinking of my character's position? If so, can you suggest a way to use the Update function to move my character when the gesture is completed?

      • You can use flags, set them when a gesture is detected in the GestureCompleted(), then check and clear the respective flag/variable in Update().
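        A sketch of that flag idea, combined with smooth per-frame motion to avoid the "blinking" effect; the class name, the step length and the speed values are assumptions:

        ```csharp
        using UnityEngine;

        // GestureCompleted() only raises a flag; Update() consumes it and
        // glides the character toward a target instead of teleporting it.
        public class MoveOnGesture : MonoBehaviour
        {
            public float stepLength = 1f;   // how far one gesture moves the character
            public float moveSpeed = 2f;    // units per second

            private Vector3 targetPos;
            private bool gestureFlag = false;

            void Start()
            {
                targetPos = transform.position;
            }

            // Call this from GestureCompleted() in your gesture listener
            public void OnGestureDetected()
            {
                gestureFlag = true;
            }

            void Update()
            {
                if (gestureFlag)
                {
                    // consume the flag: extend the target instead of jumping there
                    targetPos += transform.forward * stepLength;
                    gestureFlag = false;
                }

                // move a little toward the target every frame
                transform.position = Vector3.MoveTowards(
                    transform.position, targetPos, moveSpeed * Time.deltaTime);
            }
        }
        ```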

      • rumen, I am creating a pause menu for my game. At the start of the level, ‘Control Mouse Cursor’ is not set to true. When the player raises his left hand, the pause menu shows up and ‘Control Mouse Cursor’ should be set to true. It does get set to true, but the cursor still won't follow the player's hand. What am I missing?

      • Hi Franz, I don’t know 🙂 You can zip your project (or a sample scene demonstrating the problem), share it online and send me a link per e-mail. Then I’ll take a look and try to find out what went wrong.

      • Hi rumen, I have found out why it doesn't control the mouse: I forgot that I set Time.timeScale = 0; that's why I can't control the mouse. This creates another problem: I cannot pause the game if I don't use Time.timeScale = 0, and if I use it, I cannot control the mouse or any animations in the game. Do you have any suggestions for how I can pause the game?

      • I have thought of a solution and I wanted to know if you think it will work:

        The solution is just to prevent all other gestures from being detected, except Click and ControlMouseCursor, while setting the velocity of the character to 0 (so it is not moving) and its animation to idle.

      • Another question:

        How do I make a copy of SimpleGestureListener.cs and rename it? I tried to copy the code to a brand-new script, but it doesn't seem to work with the KinectManager. What did I do wrong?

      • Make sure your class implements GestureListenerInterface, like this: ‘public class MyGestureListener : MonoBehaviour, KinectGestures.GestureListenerInterface’. Then, when your scene is starting, clear the GestureListeners-list setting of KinectManager and add the new one programmatically, like in the SetSceneAvatars-script.
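        A skeleton of such a class might look like this. The exact method signatures differ between the Kinect-v1 and Kinect-v2 packages, so copy them from the SimpleGestureListener shipped in your own project; the ones below follow the Kinect-v2 style GestureInProgress() signature quoted elsewhere on this page:

        ```csharp
        using UnityEngine;

        // Skeleton of a custom gesture listener; signatures are assumptions,
        // copy the real ones from SimpleGestureListener.cs in your package.
        public class MyGestureListener : MonoBehaviour, KinectGestures.GestureListenerInterface
        {
            public void UserDetected(long userId, int userIndex)
            {
                // e.g. start detecting the gestures you need for this scene
            }

            public void UserLost(long userId, int userIndex)
            {
            }

            public void GestureInProgress(long userId, int userIndex,
                KinectGestures.Gestures gesture, float progress,
                KinectInterop.JointType joint, Vector3 screenPos)
            {
            }

            public bool GestureCompleted(long userId, int userIndex,
                KinectGestures.Gestures gesture,
                KinectInterop.JointType joint, Vector3 screenPos)
            {
                // react to the completed gesture here
                return true;  // true = reset the gesture state, so it can fire again
            }

            public bool GestureCancelled(long userId, int userIndex,
                KinectGestures.Gestures gesture, KinectInterop.JointType joint)
            {
                return true;
            }
        }
        ```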

      • The reason for the above question is that I modified the SimpleGestureListener to have some public game objects, such as animators, a rigidbody and the like, which are in my level-1 scene. The problem is that I also use the SimpleGestureListener in my main-menu scene, for the click gesture. The public game objects in my script will then be null, because they are in my level-1 scene. So I need a renamed copy of the SimpleGestureListener for it to work.

      • I did as you said: I used SetSceneAvatars and made a duplicate of it, so I could modify the code to add gesture listeners instead of avatar controllers. The script gets added, but it won't control the mouse, even when ControlMouseCursor is set to TRUE. The gestures get detected, but the mouse isn't controlled. What do you think the problem is?

      • Franz, are you using the Kinect-v2-package or the older one? In the K2-package the cursor is controlled by InteractionManager, not by gestures. Also, if you’re using the K2-asset, please download the latest version. There are some improvements related to mouse-cursor control and SetSceneAvatars (gesture listeners).

      • I'm using the older one, the Kinect v1 for Xbox 360, not the Kinect v2. Before I set up the KinectManager for multiple scenes, I could control my cursor in my main-menu scene and my pause menu. Now it isn't controlled, and I can't seem to find the problem. The gesture listener is connected to the KinectManager by the SetSceneAvatars script and detects gestures, as shown in the GestureInfo object during play mode, but the cursor won't move at all.

      • Rumen, I have sent you an e-mail with a link to a Unity project that recreates my problem. I used the avatars demo and created a StartUpScene. Please try it out; I hope we can find a solution or an alternative to make it work.

  6. Hi, we are using a previous version of the avatar solution, but we are facing an issue with rotation. We are working on 360-degree rotation of the avatar.

    Can you give us access to your BitBucket for the latest copy (1.1)? It would be very helpful for us to work with the latest version of your package.

    Thanks
    Jegan

  7. Hi, may I ask you a question? How can we call the methods of the VisualGestureBuilder.dll that the Kinect SDK preview provides (or how do we use adgtech)?

    • Hi there, if you mean in Unity and your app is 64-bit, just use the InteropServices.DllImport-attribute to declare the external function. Otherwise, wait until the MS team adds this functionality to the Unity plugin.
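      For illustration, a DllImport declaration looks roughly like this. The entry-point name and signature below are hypothetical placeholders; the real exports of VisualGestureBuilder.dll need to be looked up first (e.g. with ‘dumpbin /exports’):

      ```csharp
      using System;
      using System.Runtime.InteropServices;

      public static class VgbNative
      {
          // HYPOTHETICAL entry point; replace the name and signature with a
          // real export from VisualGestureBuilder.dll before use.
          [DllImport("VisualGestureBuilder.dll", EntryPoint = "SomeExportedFunction")]
          public static extern int SomeExportedFunction(IntPtr sensor, ulong trackingId);
      }
      ```

      Note that the native DLL must be reachable by the 64-bit editor/player at runtime, e.g. placed next to the executable or in the project's plugins folder.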

  8. Hi man, thanks for your work. I couldn't get familiar with the code, so I need your help in Unity to make something simple to start with for my school project, like just a cube in the scene following my hand. I would appreciate your help so much.

  9. Hi, thanks for your work. I need help in Unity for my school project; because I couldn't get familiar with the code, I need something simple to start with, like just a cube in the scene following my hand. I would appreciate your help a lot.

  10. Hi! High praise. Does it work on Win7 x64? I'm curious, since I want to buy the Kinect v2 to develop with Unity on Win7 x64,
    or do I have to use Win8?

  11. Hi Rumen:

    I sent you an email but maybe is better from here. The question is:

    Is multiplayer supported? When one person enters the scene, an avatar is attached to him, but when a second person joins the scene, the second avatar attaches to the first person too. Is this normal?

    • Hi, you can use the ‘Player Index’-setting of the AvatarController-component to specify which player controls this avatar. The default value is 0, which means the first found player. A value of 1 means the 2nd player, 2 the 3rd one, and so on.

      • Hi Rumen:

        Can I use multiplayer with Cubemans instead of Avatars? I noticed that the Cubeman doesn't have a “playerIndex” setting, but maybe adding one could work?

        I guess it would have been better to do multiplayer with Avatars from the start, but all my code and design is done for the Cubeman.

        Greetings,
        Marcos.

      • Hi Marcos,
        Yes, why not? I think you just need to add a playerIndex-setting to the CubemanController and then use GetUserIdByIndex(playerIndex) instead of GetPrimaryUserID().
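        The relevant change inside the CubemanController could be sketched like this, assuming its Update() currently gets the user ID from GetPrimaryUserID() and the rest of the component stays as-is (the IsInitialized() check is an assumption; verify it against your KinectManager version):

        ```csharp
        // inside the CubemanController class:
        public int playerIndex = 0;  // 0 = first detected user, 1 = second, etc.

        void Update()
        {
            KinectManager manager = KinectManager.Instance;
            if (manager == null || !manager.IsInitialized())
                return;

            // before: long userId = manager.GetPrimaryUserID();
            long userId = manager.GetUserIdByIndex(playerIndex);
            if (userId == 0)
                return;  // no user with this index is currently tracked

            // ...the existing joint-tracking code follows, using userId...
        }
        ```

        One Cubeman instance per player, each with a different playerIndex value, should then follow its own user.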

  12. Are there any plans for voice recognition / commands in the ‘Kinect v2 with MS-SDK’ bundle for Unity? If so, any tentative timing?

    • Because there was no documentation about what exactly the SDK-quaternions meant at the time I created this project, it was easier for me to calculate them by myself. But you are free to modify the scripts and use the SDK-provided orientations. The full source code is there.

  13. Hi Rumen, I am a computer science student and I am working on a project for head tracking in Unity 3D using Kinect v2, but so far I have totally failed. Can you please help me with this?
    Thanks in advance.

  14. Hi, I have a problem with Kinect face tracking. When I run the demo scene, I get the “Not tracking” debug message, and nothing happens after that. I use Kinect v2; the other scenes work fine.

    • First of all, check if the SDK face examples work OK, for instance ‘Face Basics-D2D’, ‘Face Basics-WPF’ and ‘HD Face Basics-WPF’. If they don't, check whether your graphics adapter is NVidia. If it is, update its drivers to the latest version from the NVidia website, then try again.

  15. Is there an easy way to access the Microsoft.Kinect.Face namespace in Unity?

    To access the scripts in Standard Assets>>Microsoft>>Kinect>>Face ??

    Like the FaceProperty enum, for example? Also, what about activities?

    I remember that in an older version of your Unity assets it was a lot easier to access that stuff. It's not in KinectInterop, so where would I find it? Thanks a ton, awesome updates.

    • Yes 🙂 Sorry for the complications; they were needed in order for the package to work with both Kinect2 and Kinect1. To get the K2-specific interface and its variables (make the ones you need public), you can use something like this:

      KinectManager kinectManager = KinectManager.Instance;
      KinectInterop.SensorData sensorData = kinectManager.GetSensorData();
      Kinect2Interface k2interface = (Kinect2Interface)sensorData.sensorInterface;

  16. Hi, I have some issues with hand and head rotation. The hands look erratic and unnatural, and the head does not rotate on the Y axis. This happens both in your demo scene and in a brand-new scene with a humanoid model. Is this an issue you're aware of?

    • Hi there, yes, I'm aware. For the head, you can use the head rotation reported by the face-tracking manager. I'm working on the wrists' rotations now, but I'm curious what exactly you mean by ‘erratic and unnatural’. Please contact me by e-mail and send me a short video or some screenshots demonstrating the issue, if possible.

      • Thank you for the fast reply, and sorry for the big delay.
        If you send me your e-mail address, I will send you a short video.

        In the meantime, I noticed that the “progress” float value returned by the
        GestureInProgress(long userId, int userIndex, KinectGestures.Gestures gesture,
        float progress, KinectInterop.JointType joint, Vector3 screenPos)
        method is 0.5 all the time. It would be great if that method returned the real gesture progress.

      • No problem. Please download and try the latest package update first. As to the gesture-progress question: only some gestures return realistic progress. These are the poses that require some time to be completed, like raise-hand, psi-pose, t-pose and stop-pose. All the rest use the progress value to specify the state they're currently in.

  17. Hi Rumen,

    I sent you an e-mail with the current results of your updated version; some of the features still fail, and I attached screenshots for you. Please correct me if I have made some mistake.
    Thanks a lot.

  18. Hi roumenf, thanks for sharing the Kinect with MS-SDK. I have a big problem with the mouse control when I work with the new canvas GUI buttons: the cursor tries to move away from the button every time I try to get close to it. How can I fix that? It becomes frustrating at times, trying to place the cursor over the button. Thanks very much for your support.

    • Hi zaniocz, thank you for reporting the issue. May I ask you to contact me by e-mail and, if possible, send me a short video depicting your problem? It would also be great if you could create a sample project for me, zip it, share it online and send me a link. Is the issue related to “Kinect with MS-SDK” or “Kinect v2 with MS-SDK”?

  19. Pingback: Kinect SDK 2 FINGER TRACKING (etc) with Mac OS X, Windows, Desktop and Large Screens (VR) | Erik Champion

  20. Hi Rumen:

    I still have problems with multiplayer =(. I’m using cubemen, but at the beginning they are not in the scene. When someone is detected, I instantiate a cubeman and the player controls it. But when a second player enters the scene, a new cubeman is instantiated and the first player controls both of them (the first and the second cubeman).

    I don’t know how to fix this =(, do you have any ideas?

    Thank you again,
    Marcos.

    • This means that you either use playerIndex=0 for both (instead of 0 and 1) or you haven’t changed the source code properly 🙂 Look at my previous comment.

  21. Hi Rumen,
    I noticed that there is an option in the “AvatarController” script that says I can set the offset according to the sensor. I tried that out, but the result is somehow not accurate enough. I need accuracy like in the “Green Ball” example. How can I achieve that accuracy? Thanks in advance.

    • The offsetRelativeToSensor-setting means that the initial position of the avatar in the scene will not be preserved, but changed according to the real position of the user in the Kinect’s coordinate system. If you need to align your avatar to the color-camera image, look at the KinectOverlayer-script source and how vPosObject is calculated.

  22. Hi Rumen,

    Just downloaded this package, but I’m having trouble running the speech-recognition scene. The scene gives me the error message “Error Initializing Kinect/SAPI: 0x80040154”. I’ve also installed the Kinect for Windows language pack. Do you have any idea what causes this error?

    Thanks

    • Hi Rumen,

      Found out what the problem was: even though I had the Kinect SDK installed, I also had to download the Kinect Runtime. It works great now.

      Thanks

      • Hi Karl, wow, congratulations! Do you mean you had to install the Kinect Runtime AFTER you installed Kinect SDK? Was your SDK the latest one?

      • Hi Rumen,

        Yeah, I downloaded the Runtime after I installed the Kinect SDK and it started working fine, and my version of the Kinect SDK is up to date. The strangest thing was that it didn’t work with Runtime v2; I had to use v1.8. Honestly, if I had to hazard a guess, I would say it was something to do with the language pack not working correctly on Runtime v2.

        Anyway, I’m happy now that it works. My eyes lit up when I saw that robot run when I shouted “Forward” 🙂

        Thanks for Everything

        Karl

  23. Hi again Rumen:

    I’m still working on the multiplayer with instances… I’ve almost got it, but I have the following problem.

    When the first player enters the scene, he’s player 0 (playerIndex = 0). When the second one enters the scene, he’s player 1 (playerIndex = 1). If player 1 leaves the scene and joins again, he will be player 1 again; that’s all correct and working nicely.

    But here is my problem:
    – The first player joins the scene and gets, for example, 5 points –> he’s Player 0.
    – The second one joins and gets 3 points –> Player 1.
    – The first player leaves the scene (with his five points) –> Player 1 now has 5 points instead of 3.

    I hope I have explained myself clearly… and if you have a solution, I’ll be glad to hear it.

    Greetings,
    Marcos.

  24. Hi Rumen!

    I have been trying to use the SDK with other plugins to control a multiplayer game for 2 users. However, whenever there are more than 2 people in the room, interference happens and the control becomes unreliable. Is there any way with your plugin to choose 2 users to be tracked and ignore the ones watching?

    Thanks!!

    • Kinect v2 can track up to six users simultaneously. If you are asking about Kinect v1, only two out of the six users are fully tracked. I think the SDK selects the fully tracked users depending on their distance to the sensor.

      • Hello, I have the same problem: I need to know the ID of the user that controls the mouse cursor, while the other users should do nothing until the first user goes away. I tried InteractionManager.getUserId(), but it works only when there is a single user. When multiple users are in front of the Kinect, controlling the mouse cursor is a pain. How can I solve this?

      • Hi, the InteractionManager uses the primary user’s ID, which it gets from the KinectManager. You can also set this ID by invoking SetPrimaryUserID(). The ID you set remains valid as long as that user is detected by the Kinect. After that, a new primary user will be selected (usually the closest one), if you don’t set one yourself.
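
        As an illustration, locking the interaction to the first detected user might be sketched like this (a sketch only, assuming the KinectManager methods GetPrimaryUserID(), SetPrimaryUserID() and IsUserTracked() exist with these names in your package version):

```csharp
// Sketch: keep the first detected user as the primary (interaction) user,
// and release the lock when that user leaves. Method names are assumptions
// based on the discussion above -- verify them in your package version.
using UnityEngine;

public class LockPrimaryUser : MonoBehaviour
{
    private long lockedUserId = 0;

    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager == null || !manager.IsInitialized())
            return;

        if (lockedUserId == 0 && manager.IsUserDetected())
        {
            // remember the first primary user selected by the KinectManager
            lockedUserId = manager.GetPrimaryUserID();
        }
        else if (lockedUserId != 0)
        {
            if (manager.IsUserTracked(lockedUserId))
            {
                // keep the lock as long as the user is tracked
                manager.SetPrimaryUserID(lockedUserId);
            }
            else
            {
                // user left; release the lock so the next user can take over
                lockedUserId = 0;
            }
        }
    }
}
```

        Attach the script to any game object in the scene, next to the KinectManager.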

  25. Hello. In the interaction demo, for example, I can control the mouse cursor via the InteractionManager, but how can I trigger the CLICK of the mouse?
    Thank you

      • Hello. Thank you very much for the reply.
        I tried, but I couldn’t do it.

        In the InteractionManager script we can control the mouse movement.
        But how can I CLICK (for example, the left mouse button)?
        Another example: I can move the mouse cursor over a button, but how can I click on it?
        I saw in the Microsoft tutorial that they press a button by holding the hand cursor over it for 2 seconds, for example.
        Maybe it’s possible to trigger the mouse click by closing the hand?
        So sorry to disturb you, but this is important, because if we can drag the mouse cursor and click on buttons and objects, the possibilities are endless.
        Sorry once again for any inconvenience.
        Hope you can help.
        Best regards

      • Ah, I understand now. The click detection works just as you describe – simply don’t move your hand (and the cursor) for about 2 seconds. You can check this in the InteractionDemo; it will display LeftClick or RightClick, depending on the hand you use. To click on the GUI buttons, you also need to enable the ‘Control mouse cursor’-setting of the InteractionManager. Hope this helps.

  26. Wow!! It works. This package is amazing.
    I hope you can make more demo scenes and put them on sale.
    Thank you once again.

  27. Hello. Sorry to disturb you again.
    Is there a way to replace the mouse click (triggered by not moving the hand for 2 seconds) with a hand grip?
    Because closing the hand is a much faster gesture for a mouse click than holding still for 2 seconds.
    Thank you very much

    • You need to modify the code of the InteractionManager to generate MouseControl.MouseClick() when a hand grip is detected. Don’t be afraid to experiment.
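
      For example, a grip-to-click helper might be sketched like this (a sketch only; the InteractionManager method GetLastLeftHandEvent() and the HandEventType enum are assumptions based on the K2-asset API, so check the actual names in your package version):

```csharp
// Sketch: fire a mouse click when the left hand closes (grip),
// instead of the hold-still click. The hand-event accessor names are
// assumptions -- adapt them to the actual InteractionManager code.
using UnityEngine;

public class GripClick : MonoBehaviour
{
    private InteractionManager.HandEventType lastEvent =
        InteractionManager.HandEventType.None;

    void Update()
    {
        InteractionManager im = InteractionManager.Instance;
        if (im == null || !im.IsInteractionInited())
            return;

        InteractionManager.HandEventType curEvent = im.GetLastLeftHandEvent();

        // trigger a single click on the open -> grip transition only
        if (curEvent == InteractionManager.HandEventType.Grip &&
            lastEvent != InteractionManager.HandEventType.Grip)
        {
            MouseControl.MouseClick();
        }

        lastEvent = curEvent;
    }
}
```

      To fully replace the hold-still click, also comment out the blocks in the InteractionManager where the click flags are set, as discussed in the comments below.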

      • Thanks for your reply.
        Actually, I bought the package for this reason.
        You are right; I don’t know if you can help me with that, or if Microsoft allows access to the nightly build.
        I am working on an installation that requires heart rate and muscle mapping, which is not yet released with the current latest SDK!

      • Hi Firas, You can copy and customize the KinectManager.UpdateUserHistogramImage()-function to get the functionality you need. The userMap-variable contains the user index at the current pixel (255, if no user is detected there). Then put only the pixels belonging to one user in the resulting texture. You can use the GetBodyIndexByUserId()-function to get the index of the user, whose pixels you need to copy.
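
        A per-user mask texture might be sketched like this (a sketch only; GetBodyIndexByUserId() is mentioned above, but the raw body-index buffer has to be obtained from the KinectManager or sensor data of your package version, so the bodyIndexMap parameter here is a placeholder):

```csharp
// Sketch: build a texture containing only one user's pixels, based on the
// body-index data. 255 in the body-index map means "no user at this pixel".
using UnityEngine;

public class UserMaskTexture : MonoBehaviour
{
    // bodyIndexMap is a placeholder for the raw body-index buffer
    // of your package version (one byte per depth pixel)
    public Texture2D GetUserMask(long userId, byte[] bodyIndexMap, int width, int height)
    {
        KinectManager manager = KinectManager.Instance;
        int bodyIndex = manager.GetBodyIndexByUserId(userId);

        Texture2D tex = new Texture2D(width, height, TextureFormat.ARGB32, false);
        Color32[] pixels = new Color32[width * height];

        for (int i = 0; i < bodyIndexMap.Length && i < pixels.Length; i++)
        {
            // keep only the pixels belonging to this user; all others transparent
            pixels[i] = (bodyIndexMap[i] == bodyIndex) ?
                new Color32(255, 255, 255, 255) : new Color32(0, 0, 0, 0);
        }

        tex.SetPixels32(pixels);
        tex.Apply();
        return tex;
    }
}
```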

  28. Hello,

    Does your module work with the Oculus Rift DK2? I had problems linking your old version with Oculus, because an Oculus project has two main cameras. If your new version is compatible with Oculus projects, I am willing to buy it. Thanks!

  29. Congratulations Rumen; it is an awesome pack!
    I cannot display only the skeleton lines on screen without the user map, though, whatever KinectManager parameters I change… I can display only the user map, or both, but not the skeleton alone. Do you know what I should do?

    Thanks!

    • Sorry, I cannot quite understand. You can see the skeleton lines on the cube-man without the user map etc. enabled. By the way, you can use the user map without displaying it on screen, if that is the problem. To do that, disable the ‘Display user map’-setting.

  30. Hi there.
    It works very nicely. All the samples are great.
    I’m learning a lot.
    Can I change the “mouse click” from holding the hand still to “hand close”?
    I can’t move a slider, for example, because when I move the hand it stops the “clicking” state.
    Thank you

  31. Hi Rumen,

    Is there a way to get the usersLblTex for each body separately? The reason is that I want to create start and end particle effects for each new user, based on the usersLblTex data.

    • You can get a rough height as (Head.pos.y – Foot.pos.y + some offset to compensate for the head). For an accurate height, I would suggest processing the body-index image, mapping the highest pixel (min y) and lowest pixel (max y) of the user to space coordinates, and taking the difference between them.
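
      The rough-height formula above might be sketched like this (a sketch only, assuming the KinectManager’s GetJointPosition()-function and the KinectInterop.JointType enum; the head offset is a guessed constant):

```csharp
// Sketch of the rough-height estimate: head joint minus foot joint,
// plus an offset for the part of the head above the head joint.
using UnityEngine;

public static class UserHeight
{
    // rough offset to compensate for the top of the head (meters, an assumption)
    private const float HeadOffset = 0.1f;

    public static float GetRoughHeight(long userId)
    {
        KinectManager manager = KinectManager.Instance;
        if (manager == null || !manager.IsUserDetected())
            return 0f;

        Vector3 headPos = manager.GetJointPosition(userId, (int)KinectInterop.JointType.Head);
        Vector3 footPos = manager.GetJointPosition(userId, (int)KinectInterop.JointType.FootLeft);

        return (headPos.y - footPos.y) + HeadOffset;
    }
}
```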

      • Thanks for your reply.
        Actually, I need an accurate height,
        so I will try the second approach you mentioned.
        I have another question: what is the easiest way to get face expressions, if that’s possible?

  32. Hello,
    I would like to ask if it’s possible to detect whether the device is not connected (either K1 or K2). I tried the functions “IsKinectInitialized()” and “IsInitialized()” from the KinectManager script, but even though no device is connected, those functions return true. I can’t find any other suitable function.

    Thanks for your response.

    Milan

    • This code should do the job:

      KinectManager manager = KinectManager.Instance;
      if (manager && manager.IsInitialized())
      {
          KinectInterop.SensorData sensorData = manager.GetSensorData();
          int sensorsCount = (sensorData != null && sensorData.sensorInterface != null) ?
              sensorData.sensorInterface.GetSensorsCount() : 0;

          // sensorsCount == 0 means no sensor is currently connected
      }

      • To add to my previous comment: I created a getter for sensorData, and when I print sensorsCount, it always says “1”, even when no Kinect is connected.

      • You can modify the Kinect2Interface.GetSensorsCount()-function to this:

        public int GetSensorsCount()
        {
            int numSensors = 0;

            KinectSensor sensor = KinectSensor.GetDefault();
            if (sensor != null && sensor.IsAvailable)
            {
                numSensors = 1;
            }

            return numSensors;
        }

        The little issue I find with it is that the IsAvailable-property will not change if you unplug the sensor during the game. You need to restart the game to find out that the sensor is no longer available.

      • Thanks for your response. I changed the GetSensorsCount()-function of Kinect2Interface. Unfortunately, now the IsAvailable property is mostly “False”. It was “True” once for an unknown reason, but when I restarted Unity it was “False” again, even though nothing had changed and the Kinect was still connected. (The source code is available on the project’s website, but without the updated GetSensorsCount()-function.)

      • Then it won’t hurt if you research the issue a bit yourself and run some tests too. I work alone and don’t have unlimited free time to research non-critical issues.

  33. Hello there!!
    I hope you can add FBX or BVH recording.
    Your v2.3 package is the best I’ve ever seen!!
    Thank you very much

  34. Hi Rumen. Great asset! It works amazingly! One thing: I’m trying to apply animation to the arms of my avatar (for use with the Leap Motion, to provide IK for the arms), but when I add the animation and apply a mask to it, the rest of the body is still reset to the bind pose. Can you suggest a way around this with your asset? I found a comment that suggests a solution with the Microsoft SDK:

    http://forum.unity3d.com/threads/animation-legs-mecanim-movement-arms-kinect.200737/

    Really appreciate anything you can suggest!

  35. Hello. Great asset, it saves tons of work; nearly all the binding is done. BUT: in the face-frame result, why have you pruned Microsoft_Kinect_Face_FaceFrameResult_get_FacePointsInColorSpace out of the implementation? The function pointer at least seems to be present in your KinectFaceUnityAddin.dll.

    Is there any way to get it back? The main reason I bought this asset was to be able to use the full Kinect2 API in Unity without having to bother about linking things myself, and even better if some example scenes were available and ready to use. Seeing a key feature of the API missing is a bit… disappointing, to say the least.

    Best regards,

    A user happy but unhappy at the same time.

  36. This is the effect I get when applying animation to the arms of the avatar with the AvatarControllerClassic script attached.

    I know you’re ignoring me now, but I tried to send the package and it was too large, so I created a short video instead.

    I changed Update to LateUpdate, but that’s as far as I’ve gotten. It does give more movement, but it’s still fighting the bind pose.

    Please help.

    • Why would I ignore you? I just asked you 3 days ago to zip your project (or prepare a sample project for me), share it online (e.g. in Dropbox, Google Drive, OneDrive, etc.) and send me a link, so I could check what’s wrong and try to find a workaround. I wonder what was so difficult to understand here. Thanks for the video, but what am I supposed to do with it? Ok, I’ll prepare a sample scene on my own when I find some time.

      • Hi, thanks for your continued condescending tone. I really appreciate it. I did explain that I was struggling to get a package over to you. Really appreciate your response!! :/

      • It is essentially your product with an animator attached, as I have tried to explain about 5 times… I’m not sure why this would need me to build a package. As I explained, I was struggling to get one small enough to send. But your condescending tone is really useful and just what I needed!

      • I do appreciate you being willing to help, but your responses have not been very helpful. I will create a package today that is your avatar demo scene with an animator applied to the arms. If you have time to look at it before the end of the week, that would be great. If not… then not.

      • Yes, it is not easy, but also not impossible to solve. Here is what you can do, for a start:
        1. Use the AvatarControllerClassic and assign only those joints that need to be controlled by the sensor. Set the SmoothFactor-setting of the AvatarControllerClassic to 0, to apply the bone orientations immediately.
        2. Create an avatar body mask and put it on the animation layer. It should disable the animation of the same Kinect-controlled joints. Also, do not disable the root joint.
        3. Now the most difficult part: the AvatarController’s updates need to be invoked in LateUpdate(). To do that, you need to modify the KinectManager script a bit. Open it and add a ‘void LateUpdate()’-function after the Update()-function. Then find this loop in Update(): ‘foreach (AvatarController controller in avatarControllers) { … }’. Copy the whole loop into LateUpdate() and comment it out in Update().
        4. Run the scene. When the player gets recognized, part of his bones will be controlled by the Kinect and the other part by the Animator.
        5. If it works so far, please continue tweaking and improving it on your own.

        p.s. Try importing the unity-package you sent here into a new project. Anyway, it is of no importance any more.
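
        Step 3 might be sketched like this inside the KinectManager script (a sketch only; the avatarControllers list and the UpdateAvatar() call are assumptions based on the description above, so check the actual loop body in your package version):

```csharp
// Inside KinectManager.cs -- move the avatar updates to LateUpdate(),
// so they run after the Animator and override the animated pose.
void Update()
{
    // ... existing KinectManager update code stays here ...

    // The avatar-controller loop is commented out here and moved below:
    // foreach (AvatarController controller in avatarControllers)
    // {
    //     controller.UpdateAvatar(controller.playerId);
    // }
}

void LateUpdate()
{
    // Invoked after all Update() calls and after the Animator has applied
    // its pose, so the Kinect-controlled bones win over the animation.
    foreach (AvatarController controller in avatarControllers)
    {
        if (controller != null)
        {
            controller.UpdateAvatar(controller.playerId);
        }
    }
}
```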

      • Hi, thanks for taking a look. In the video (which I now know you couldn’t see) I had done the same thing, and following your instructions I get the same issue: the Kinect is still fighting with the bind pose… Did it work on your end?

  37. Hello there Rumen.
    I did 3 school projects with the help of your Unity package:

    https://www.youtube.com/watch?v=S8VSfL-oHCs
    https://www.youtube.com/watch?v=dlrfMJAs2-k

    Amazing!! My whole school loves it!!!

    I need a little help :(
    I’m making a “Drum”: a hand cursor with the InteractionManager, and when I close my hand I play the drum!! Everything works great, but… I want to remove the ability to click when my hand is not moving!!
    Because if I hold my hand still over the “drum”, it plays!
    How can I remove that click, or change the delay from 2 seconds to 10, for example?

    Thank you once again for everything.
    Like I said, there’s nothing, nothing on the internet that helps as much as your work; even Microsoft itself doesn’t give much support.

    • Hi Ricardo, congratulations on the projects! To your question: Just open the InteractionManager and comment out the blocks of code, where ‘isLeftHandClick = true’ or ‘isRightHandClick = true’. That would do the job. Thank you for sharing the videos!

  38. Hi Rumen,
    I have 2 things I’m struggling with:
    First, I need to apply the hand rotations to the tracked body, and if I could also track the fingers, that would be great.

    Second, how do I limit the Kinect-tracked body animation to certain movements,
    or maybe integrate it with the “Per-Muscle Settings”:
    http://i.imgur.com/h00kIfA.png

    I would be happy to help work on these 2 features, if you can guide me on where to start?

    Best.
    Firas.

    • Hello Firas, there is an ‘Allow hand rotation’-setting of the KinectManager, as well as a bone-orientation filter. But as there have been some recent updates related to these features, please contact me by e-mail first. Then you can tell me how you’d like to contribute to these features.

  39. Hello again Rumen.
    Thank you once again for the quick reply.
    I commented out the blocks of code where ‘isLeftHandClick = true’ or ‘isRightHandClick = true’.
    But it did the opposite.
    Now it doesn’t click when I close my hand; it only clicks when I don’t move my hand.

    Can you please tell me how to disable the click with the hand held still, and only activate the click on hand close?

    Thank you once again, and so sorry to bother you with this kind of problem.
    Hope you can help me.

    • Hello Ricardo, have you already modified the InteractionManager? Originally, the click happens only when you don’t move your hand for about 2 seconds, as far as I remember. Please contact me by e-mail and send me your InteractionManager (or even better, zip your project or a sample project, put it in Dropbox and send me a link), so I can check what is wrong with this “fix”.

  40. Hello again.
    I sent it to you via WeTransfer. I have several scenes, but it’s the “Drum” scene.
    I have no more words to thank you for all the help you are giving to a whole community of artists, developers and students…
