Kinect v2 Examples with MS-SDK

Kinect v2 Examples with MS-SDK is a set of Kinect-v2 (aka ‘Kinect for Xbox One’) examples that use several major scripts, grouped in one folder. The package contains over thirty demo scenes.

Please look at the Azure Kinect Examples for Unity asset, as well. It works with Azure Kinect sensors (aka Kinect-for-Azure, K4A), as well as with Kinect-v2 sensors (aka Kinect-for-Xbox-One).

The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demos show how to use programmatic or visual gestures, the fitting-room demos show how to create your own dressing room, the overlay demos show how to overlay body parts with virtual objects, and so on. You can find short descriptions of all demo scenes in the K2-asset online documentation.

This package works with Kinect-v2 sensors (aka Kinect for Xbox One) and Kinect-v1 sensors (aka Kinect for Xbox 360). It can be used with all versions of Unity – Free, Plus & Pro.

Free for education:
The package is free for academic use. If you are a student, lecturer or academic researcher, please e-mail me to get the K2-asset directly from me.

One request:
Please don’t share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.

Customer support:
* First, please check if you can find the answer you’re looking for on the Tips, tricks and examples page, as well as on K2-asset Online Documentation page. See also the comments below the articles here, or in the Unity forums. If it is not there, you may contact me, but please don’t do it on weekends or holidays.
* If you e-mail me, please include your invoice number. You can find more information regarding e-mail support here.
* Please note that you can always upgrade your K2-asset free of charge from the Unity Asset Store.

How to run the demo scenes:
1. (Kinect-v2) Download and install Kinect for Windows SDK v2.0. The download link is below.
2. (Kinect-v2) If you want to use Kinect speech recognition, download and install the Speech Platform Runtime, as well as EN-US (and other needed) language packs. The download links are below.
3. (Nuitrack-Deprecated) If you want to work with Nuitrack body tracking SDK, please look at this tip.
4. Import this package into a new Unity project.
5. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’. The target platform should be ‘Windows’ and architecture – ‘Intel 64-bit’.
6. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
7. Open and run a demo scene of your choice, from a subfolder of ‘K2Examples/KinectDemos’-folder. Short descriptions of all demo scenes are available here.

* Kinect for Windows SDK v2.0 (Windows-only) can be found here.
* MS Speech Platform Runtime v11 can be downloaded here. Please install both x86 and x64 versions, to be on the safe side.
* Kinect for Windows SDK 2.0 language packs can be downloaded here. The language codes are listed here.

Documentation:
* The online documentation of the K2-asset is available here and as a pdf-file, as well.

Downloads:
* The official release of the ‘Kinect v2 with MS-SDK’ is available at Unity Asset Store. All updates are free of charge.

Troubleshooting:
* If you get errors like ‘Texture2D’ does not contain a definition for ‘LoadImage’ or ‘Texture2D’ does not contain a definition for ‘EncodeToJPG’, please open the Package Manager, select ‘Built-in packages’ and enable the ‘Image conversion’ and ‘Physics 2D’ packages.
* If you get compilation errors like “Type `System.IO.FileInfo’ does not contain a definition for `Length’”, you need to set the build platform to ‘Windows standalone’. For more information look at this tip.
* If the Unity editor crashes, when you start demo-scenes with face-tracking components, please look at this workaround tip.
* If the demo scene reports errors or remains in ‘Waiting for users’-state, make sure you have installed Kinect SDK 2.0, the other needed components, and check if the sensor is connected.

* Here is a link to the project’s Unity forum: http://forum.unity3d.com/threads/kinect-v2-with-ms-sdk.260106/
* Many Kinect-related tips, tricks and examples are available here.
* The official online documentation of the K2-asset is available here.

Known Issues:
* If you get compilation errors, like “Type `System.IO.FileInfo’ does not contain a definition for `Length“, open the project’s Build-Settings (menu File / Build Settings) and make sure that ‘PC, Mac & Linux Standalone’ is selected as ‘Platform’ on the left. On the right side, ‘Windows’ must be the ‘Target platform’, and ‘x86’ or ‘x86_64’ – the ‘Architecture’. If Windows Standalone was not the default platform, do these changes and then click on ‘Switch Platform’-button, to set the new build platform. If the Windows Standalone platform is not installed at all, run UnityDownloadAssistant again, then select and install the ‘Windows Build Support’-component.
* If you experience Unity crashes, when you start the Avatar-demos, Face-tracking-demos or Fitting-room-demos, this is probably due to a known bug in the Kinect face-tracking subsystem. In this case, first try to update the NVidia drivers on your machine to their latest version from NVidia website. If this doesn’t help and the Unity editor still crashes, please disable or remove the FacetrackingManager-component of KinectController-game object. This will provide a quick workaround for many demo-scenes. The Face-tracking component and demos will still not work, of course.
* Unity 5.1.0 and 5.1.1 introduced an issue, which causes some of the shaders in the K2-asset to stop working. They worked fine in 5.0.0 and 5.0.1 though. The issue is best visible, if you run the background-removal-demo. In case you use Unity 5.1.0 or 5.1.1, the scene doesn’t show any users over the background image. The workaround is to update to Unity 5.1.2 or later. The shader issue was fixed there.
* If you update an existing project to K2-asset v2.18 or later, you may get various syntax errors in the console, like this one: “error CS1502: The best overloaded method match for `Microsoft.Kinect.VisualGestureBuilder.VisualGestureBuilderFrameSource.Create(Windows.Kinect.KinectSensor, ulong)’ has some invalid arguments”. This may be caused by the moved ‘Standard Assets’-folder from Assets-folder to Assets/K2Examples-folder, due to the latest Asset store requirements. The workaround is to delete the ‘Assets/Standard Assets’-folder. Be careful though. ‘Assets/Standard Assets’-folder may contain scripts and files from other imported packages, too. In this case, see what files and folders the ‘K2Examples/Standard Assets’-folder contains, and delete only those files and folders from ‘Assets/Standard Assets’, to prevent duplications.
* If you want to release a Windows 32 build of your project, please download this library (for Kinect-v2) and/or this library (for Kinect-v1), and put them into K2Examples/Resources-folder of your project. These 32-bit libraries are stripped out of the latest releases of K2-asset, in order to reduce its package size.

What’s New in Version 2.21:
1. Added ‘Fixed step indices’ to the user detection orders, to allow user detection in adjacent areas.
2. Added ‘Central position’-setting to the KinectManager, to allow user detection by distance, according to the given central position.
3. Added ‘Users face backwards’-setting to the KM, to ease the backward-facing setups.
4. Updated Nuitrack-sensor interface, to support the newer Nuitrack SDKs (thanks to Renjith P K).
5. Cosmetic changes in several scenes and components.

Videos worth more than 1000 words:
Here is a video by Ricardo Salazar, created with Unity5, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…a video by Ranek Runthal, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.3:

…and a video by Brandon Tay, created with Unity4, Kinect v2 and “Kinect v2 with MS-SDK”, v.2.0:

806 thoughts on “Kinect v2 Examples with MS-SDK”

  1. Hey Rumen, thanks for your previous help. Rigging and model loading are working great, but unfortunately, now that I’ve updated to the most recent version, the Kinect SDK doesn’t seem to be included when I build the project?

      • Sure. Send me a message telling why you need the K2-asset, from your university e-mail address, to prove you are really a student there.

      • Hi Rumen, I am a student who wants to do some research on VR using the Kinect 2.0 and Unity, so this package would be very useful for me. I can write from my university e-mail address, and if needed I can provide my student card as proof. Thanks

      • Hi sir, I am a Unity developer doing development from home. Can I get the Kinect-v2 asset for free? Thanks

      • Sure. If you are a student – no problem. Just send me your request from your university e-mail address.

      • Please read a bit before reporting issues. Regarding speech recognition, see the ‘How to run the demo scenes’ and Downloads sections above.

  2. Could you instruct us on how to do real-time avatar face scanning using the Kinect Fusion function? Sample code would be much appreciated.

    • Unfortunately I can’t instruct you or provide you any code, because Kinect fusion is not (and never was) part of the K2-asset. Regarding the face scanning, I think you can use the face model mesh (available through the FacetrackingManager component/script). Check the 1st face-tracking demo, to see what I mean.

      • Hi Rumen, are there any instructions for getting it to work with a kinect v1? I see the interfaces and plugins for each…

        Not at the same time obviously, but I’m trying to make a generic app that can run with either of the sensors.

        I’ve tested the Kinect v1 with your older plugin, as well as with the MS SDK examples, and the Kv1 definitely works in all of them, but in Unity it only works with the v2. I even tried an empty project with just this plugin, and the v2 works but not the v1. Thank you so much!

        Off topic and if you’re busy ignore: but how the heck did you figure out how to load the plugins from zips in the streaming asset folder!? I don’t see that documented anywhere for Unity….it’s brilliant

  3. Hi Rumen

    I bought your Kinect version 2 api for Unity and I’m wondering what the absolute minimum of scripts, dlls, assets etc. would be to enable just your Speech Recognition API.

    For example if I were to just copy your Speech Recognition demo into a new Unity project, and even got rid of the robot avatar in that demo, what are the minimum files I would need to get just the speech recognition working (and the debug info showing up in that debug window).

    Thanks Rumen,

    Chris

      • Thank you very much Rumen! And thanks for developing the API.

        As a learning thing, I’ve also been trying to enable Kinect Speech recognition using just the Unity plugins rather than your API. I really like your API but I’m also interested in knowing how to enable speech in Unity without it. There’s almost no documentation about how to do this – do you know of any?

      • Well, Kinect speech recognition is not directly related to the Kinect. It is more a wrapper over the MS Speech Platform runtime. In this case the Kinect acts just as a microphone to the system. So, what you need is to read the Speech Platform documentation and use its respective API in Unity. This is what Kinect2SpeechWrapper.dll does in the K2-asset. Hope this info is enough for a start.

  4. Hi Rumen,

    Do you have any plans for making the Kinect stream over TCP/IP to Unity? I think this would be great for virtual reality applications.

    Roman

    • Hi, not yet but this is possible. In a similar way as the current save/play functionality in the player demo. My only problem is time. Please remind me in several weeks, in case I forget.

      • Hi Mr. Rumen, I had sent you a mail with my College ID card, for the free educational package which you said you would give if Mr Kevin Foley’s Comment was deleted. It has been deleted now, could you please provide me with the package. Thank You Mr Rumen.

      • Hi. Yes, indeed. Just saw the offensive comment of Mr. Foley was really deleted. Please send again an e-mail request from your university e-mail address, for the educational package you need.

  5. Hi Rumen, I would like to ask another question if you don’t mind.

    I’m trying to track a person using the Kinect 2, but the data provided doesn’t quite fit the expected results. For example, a tall person appears to have a height of 0.74 meters, measured at the shoulders. I’ve read at some point that the coordinates for joints are given in ‘kinect space’, but I don’t quite understand what that means.

    Do I have to apply some kind of conversion to translate the coordinates to meters in the real world? Or is there any other thing I’m missing?

    Thanks for your help.

    • Kinect space means a coordinate space (in meters), where the sensor is in the center of the coordinate system. There are two settings of the KinectManager, called ‘Sensor height’ and ‘Sensor angle’, which are used to convert Kinect coordinates to real-world coordinates. You can find more information here: http://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t13 Also, run the KinectOverlayDemo2 to see where exactly the shoulder joints are located. They’re inside the arms, not on top of the shoulders.
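
      For illustration, here is a minimal sketch (not the asset’s internal code) of such a conversion, assuming the sensor-height and sensor-angle values described above; the class and variable names are illustrative only:

      using UnityEngine;

      public class KinectSpaceToWorld : MonoBehaviour
      {
          public float sensorHeight = 1.0f;  // sensor height above the floor, in meters
          public float sensorAngle = 0f;     // sensor tilt in degrees (the sign convention may differ in your setup)

          // converts a sensor-centered (Kinect-space) position, in meters, to world coordinates
          public Vector3 KinectToWorld(Vector3 kinectSpacePos)
          {
              Quaternion tiltCorrection = Quaternion.Euler(-sensorAngle, 0f, 0f);          // undo the sensor tilt
              return tiltCorrection * kinectSpacePos + new Vector3(0f, sensorHeight, 0f);  // lift by the sensor height
          }
      }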

  6. Hello Rumen, I’m a student at Hochschule Bochum in Germany. I’m now making a project with a Kinect v2 camera, to control a robot arm (an AL5A arm with an SSC-32U) with my own arm. But I have a problem: how do I measure the orientation of my arm with _JointOrientation and Vector4 in Kinect.h? I have found that we can use boneOrientation in SDK 1.7 or earlier versions; however, I can’t solve the problem with SDK 2.0. Could you please tell me? Your answer will be appreciated.

    • Hi, if you use Kinect v2, you need Kinect SDK 2.0, not 1.7. I have actually never used the bone orientations that come with the SDK. As far as I remember though: 1. this Vector4 is not a vector, but a quaternion (rotation value); 2. you should first orient your robot’s arm up and then apply this rotation. Please email me, if you have further questions.
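
      As a hedged illustration (not code from the SDK or the K2-asset), applying such a joint-orientation quaternion to an upright arm transform in Unity could look like this; the names are assumptions:

      using UnityEngine;

      public class ArmOrientationSketch : MonoBehaviour
      {
          public Transform armTransform;        // the (virtual) arm, upright in its neutral pose
          private Quaternion initialRotation;   // the neutral, upright rotation

          void Start()
          {
              initialRotation = armTransform.rotation;
          }

          // x, y, z, w are the components of the SDK's JointOrientation Vector4 (a quaternion)
          public void ApplyJointOrientation(float x, float y, float z, float w)
          {
              Quaternion jointOrientation = new Quaternion(x, y, z, w);
              armTransform.rotation = initialRotation * jointOrientation;  // mirroring/handedness may need extra handling
          }
      }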

  7. Hi Rumen, thank you for the great assets.
    I have a question.
    When I move in and out of the tracking area repeatedly, the initial position of the avatar is not correct.
    The initial joint position values of the avatar are incorrect when the move-rate value is over 1.
    I changed this.

    [ AvatarController.cs(about line 610)]
    =====================================
    xPos = jointPosition.x * moveRate - xOffset;
    float yPos = jointPosition.y * moveRate - yOffset;
    float zPos = !mirroredMovement ? -jointPosition.z * moveRate - zOffset : jointPosition.z * moveRate - zOffset;
    =====================================

    Is this correct?

    • Hi Teruaki, I suppose you mean the beginning of Kinect2AvatarPos()-function. The line numbers are different in my source, due to some later changes maybe. I think your fix is right. Have you checked, if the avatar’s position is correct, with different values of Move-Rate?

      • Thank you for your reply.
        Yes, it is in the Kinect2AvatarPos()-function.

        If Move-Rate is 1, Avatar’s initial position is correct.
        If Move-Rate is 5, Avatar’s initial position is incorrect.
        Maybe when Move-Rate is over 1, Avatar’s initial position is incorrect.

      • In my source it looks like this:

        float xPos = (jointPosition.x - xOffset) * moveRate;
        float yPos = (jointPosition.y - yOffset) * moveRate;
        float zPos = !mirroredMovement ? (-jointPosition.z - zOffset) * moveRate : (jointPosition.z - zOffset) * moveRate;

        I didn’t have time this week to check it thoroughly, but it looks correct to me. Maybe you are using an old version?

      • Sorry for my unclear description.
        Yes, I updated your source code,
        because if Move-Rate is over 1, the avatar’s initial position was wrong.

        After updating my source code, it works correctly.
        This is like a bug report.
        Thank you.

  8. Hi Rumen, thank you for the great assets.
    I am working with Unity and Kinect v2, and I have a question…
    Is it possible to take a photo or video when I press a key, and save it?

    • Hi Felipe, Yes, sure. KinectManager.Instance.GetUsersClrTex() will give you the current color texture. Then use the Texture2D.EncodeToJPG() to encode it, and finally save the resulting bytes in a file with File.WriteAllBytes().
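
      A minimal sketch of these steps (assuming the KinectManager in the scene is initialized and GetUsersClrTex() returns a readable Texture2D, as described above; the key and file name are just examples):

      using System.IO;
      using UnityEngine;

      public class ColorFrameSnapshot : MonoBehaviour
      {
          void Update()
          {
              // press S to save the current color-camera frame as a JPG file
              if (Input.GetKeyDown(KeyCode.S) && KinectManager.Instance != null && KinectManager.Instance.IsInitialized())
              {
                  Texture2D colorTex = KinectManager.Instance.GetUsersClrTex();

                  if (colorTex != null)
                  {
                      byte[] jpgBytes = colorTex.EncodeToJPG();
                      File.WriteAllBytes(Path.Combine(Application.persistentDataPath, "snapshot.jpg"), jpgBytes);
                  }
              }
          }
      }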

  9. Hi Rumen, I have a question: is it possible to get the rotation of the arms (or any bone) around its own forward axis? It seems that by default this rotation is not tracked by your avatar controllers. The rotation I’m talking about is this:
    http://media.trusper.net/u/5fcda39e-f519-42ce-9cf0-0c1574827f1a.jpg

    I know that this is possible even with Kinect v1, because I saw it done in an older Kinect plugin for Unity, the ZigFu plugin. Do you know how to implement this? Thanks!

  10. Hi,
    Thank you for your posts, they are always useful. At this moment, I’m doing a project for the university and your code could really help me.
    How can I get the code?
    Thank you very much,
    Elisa

  11. Hi, I’m a student working on my research. I actually want to send you an e-mail to get the free asset, but I just don’t know where to find your e-mail address.

  12. Hi Rumen =)

    I’ve a question… If I move an avatar to the left or to the right, eventually it leaves the scene… How can I avoid this? Is there any option in your library? Thanks one more time for your work! It’s really amazing.

    • Hi, I don’t quite understand your question. If the avatar is out of the scene, but the user is not yet lost, you can move or rotate the main camera. If the user gets temporarily lost, there is a setting of the KinectManager called ‘Wait time before remove’ (in seconds). Mind that the sensor must re-detect a user with the same userId within the specified time, in order for this to work. Otherwise, if it detects a new user (userId), you can modify the CalibrateUser()/RemoveUser()-functions in the KM, to get a higher-level matching functionality, probably by looking at the positions of the lost and the newly found user. Hope these tips help a bit 🙂

      • I see. You can enable ‘External root motion’-setting of the AvatarController-component and then move the avatar with your own script within the limits. Or modify the MoveAvatar()-function in AvatarController to respect the scene limits.
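
        For example, here is a minimal sketch of a clamping script (illustrative names and limits), attached to the avatar when ‘External root motion’ is enabled and your own code moves the root:

        using UnityEngine;

        public class ClampAvatarToScene : MonoBehaviour
        {
            public float minX = -3f;  // left scene limit, in world units
            public float maxX = 3f;   // right scene limit, in world units

            void LateUpdate()
            {
                // runs after your own movement code and keeps the avatar inside the limits
                Vector3 pos = transform.position;
                pos.x = Mathf.Clamp(pos.x, minX, maxX);
                transform.position = pos;
            }
        }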

  13. Hi there, I bought your plugin despite the fact that I am a student. Your work is awesome 🙂

    I have both devices, and sometimes I need to switch to the previous version. Is there any way to force your plugin to use the Kinect 1? It seems to choose depending on the installed SDK. Since I have installed both, it chooses the Kinect 2, despite the fact that it is not plugged in…

    Thanks for the support!

  14. Hello..

    We are testing this asset purchased from account sumit.solanki@hotmail.com..

    I am getting this console debug message on all demos:
    K2-sensor opened, available: False


    K2-sensor closed, available: False

    It does not start and the sensor does not function. It works with your other, free plugin, but not with the paid one. We are using Kinect v2 and everything is installed.

  15. Hello..

    I am testing this asset, purchased from the account hamid.bilal@ringme.com.
    I am getting the following error:

    Assets/KinectScripts/Interfaces/Kinect1Interface.cs(569,64): error CS1061: Type `System.IO.FileInfo’ does not contain a definition for `Length’ and no extension method `Length’ of type `System.IO.FileInfo’ could be found (are you missing a using directive or an assembly reference?)

    How can I solve it?

  16. Hi Rumen,
    I want to purchase this, but I have Unity 4.6. I wonder if I can run this with just the free 5.0.1 version of Unity, or do I need Pro?
    If I need Pro, could you please share a link to the old version? Thanks.

    Have a great day.

    • Hi, NONE of my Unity assets have ever required the Pro-version, in order to work. Just see the asset description: “This package works with Kinect v2 and v1, supports Windows 32- and 64-bit builds and can be used in both Unity Pro and Unity Personal editors.”

  17. Hi Rumen,
    I’m trying to export an .exe from the project, but when I run it, this error is written to the log many times:

    “Fallback handler could not load library C:/Users/name/Desktop/Kinect/kinect_Data/Mono/Kinect10.dll”
    and
    “Fallback handler could not load library C:/Users/name/Desktop/Kinect/kinect_Data/Mono/libKinect10.dll”

    I’m using Kinectv2 (kinect20.dll). What should I do? I think that all resources are copied in my project.

    Thanks so much!

  18. Hi Rumen,

    I have a question, and forgive me if it was asked before, but what examples for VR and mobile are available on request? I’d love to add Kinect support to a multiplayer mobile VR experience I am working on. Thank you so much in advance,

    • They’ll be available soon on Unity asset store. But if you want to check them out in advance, contact me by e-mail and mention your invoice number.

  19. Hello Rumen,

    I have bought your package and it’s so awesome 🙂

    But I have a question. Is it possible to untrack some parts of the body at runtime?

    Thanks a lot for your help.

  20. Pingback: Kinect v2 Mobile & VR Examples | RFilkov.com - Technology, Health and More

  21. Hi, I am a student doing some research on Kinect 2, and I really need this package to further my work. Can I get it for free? Great thanks for your work! This is the homepage of my university: http://www.bupt.edu.cn/,
    and my email address is: mzengxia@gmail.com.

  22. Hello Rumen,

    Thanks a lot for your awesome package.
    We are working on a two-player game for Kinect right now, and we are stuck on a player-index problem.
    We want to limit the maximum number of player IDs that are given out to two; otherwise, when a player leaves the scene and jumps back in, he gets the wrong player index and is therefore unable to control his character.

    I would be very glad if you could help us with that issue.
    Thanks a lot and best regards,

    Luca, Max, Friedrich

    • Hi, I think you should modify the GetEmptyUserSlot()-function in KinectManager.cs, so that it matches your requirements – two users & same indices/slots. See this tip, if you need an example: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t23 There is also a setting of KM called ‘Max tracked users’, but it will only limit the number of users. In your case you could limit the number of users in the function, by returning -1 for non-valid users. All this may sound complicated, but actually it isn’t. You will see… 🙂
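
      Just to illustrate the idea (this is not the real function from KinectManager.cs, which works with the manager’s internal user-ID array): a fixed two-slot version, placed inside KinectManager.cs, could look like this, returning -1 when both slots are taken:

      // 'userSlots' stands in for the manager's internal array of tracked user IDs,
      // sized to two; a value of 0 means the slot is free.
      private int GetEmptyUserSlot(long[] userSlots)
      {
          for (int i = 0; i < userSlots.Length; i++)
          {
              if (userSlots[i] == 0)
                  return i;  // re-entering players get slot 0 or 1 again
          }

          return -1;  // both slots taken - additional users are ignored
      }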

  23. Hi Rumen!

    Thanks again for your continuous improvements of your Kinect assets!

    I want to ask you: is it possible to have full-body tracking and face tracking simultaneously on one single 3D avatar? How can I do this?

    Thanks in advance for your help!

    Best regards,

    Cris.

    • I would combine the FacetrackingManager from FacetrackingDemo1 (or FT-Demo4), i.e. the respective settings and components, with the avatar tracking in FittingRoomDemo2. Hope I understood your question correctly.

  24. Hi,
    I am a student and I have to do a project involving face detection.
    How can I get the code?
    Thank you

  25. Hi Rumen F,

    Great work on the K2 package! It really helped us out a lot with our Kinect projects.

    I’m having an issue with the package version after trying to import it into a new project after a few months. The version I’m getting seems to be stuck on 2.6, whereas the official one is now 2.9 (2.9.5). I have looked through your documentation/troubleshooting and was unable to find any solutions as to why I’m facing this issue.

    It would be of much help if you could get back to me with a solution! Thank you!

    • Hi Justin,

      I don’t quite understand your issue. Do you import the package from the Unity asset store or from somewhere else? The K2-asset on the Asset store is currently v2.9.5, and it requires Unity 5.3.0.

      • Hi Rumen,

        I am importing straight from the Unity editor Asset Store (Ctrl+9), with the account that bought the K2-asset. The header, however, has 2.6 appended to the end (the same version I bought in 2015), and it always seems to be that version. Even the imported change log shows that the latest is 2.6.

      • This looks like an Asset Store issue to me. Why don’t you contact the Asset Store team and report this strange behavior? If you need the latest K2-asset right away, please send me an e-mail and mention your invoice number. Then I’ll send you the K2-package directly.

  26. Hi Rumen, I just bought your package. When I try FaceTrackingDemo4, the first try succeeds but then freezes; the second time it gives the error ‘Skeleton Data Smoothing failed’. I’m using Kinect v1 and SDK 1.8.

    • Hi, I’ve actually never heard of an issue like yours so far. Are you using Kinect v1? Also make sure your graphics card drivers are up to date.

  27. Hi Rumen, there is a bug in version 2.9: KinectManager.IsKinectInitialized() always returns true, even though I don’t have any Kinect devices connected to the computer. Some log messages are written to the console when the app starts:

    K2-sensor opened, available: False
    UnityEngine.Debug:Log(Object)

    Interface used: Kinect2Interface
    UnityEngine.Debug:Log(Object)

    I use KinectManager.IsKinectInitialized() or KinectManager.Instance.IsInitialized() to avoid doing some stuff in my code and also to warn the user when the Kinect is unplugged, but as these methods always return true regardless of whether the Kinect is present or not, my application breaks. Is there any way to quick-fix or work around this? Did you resolve this in 2.10?

    thanks!

    • I’ve found that in Kinect2Interface there is this:
      // change this to false, if you aren’t using Kinect-v2 only and want KM to check for available sensors
      public static bool sensorAlwaysAvailable = false;

      I’ve changed it and it worked, but now the singleton for the KinectManager never gets assigned, because the line

      // set the singleton instance
      instance = this;

      is written after the exception-handling code that returns from the method if something went wrong. So now every line of code that tries to use KinectManager.Instance throws a null-reference exception. I think this shouldn’t happen, because the fact that the Kinect isn’t available doesn’t mean that the KinectManager is not present in my scene – in fact it is present, but I can’t access it in a singleton-fashioned way like KM.Instance. A quick fix could be to put the “instance = this” line before everything else in the Awake() of KinectManager.cs, but I don’t like to modify third-party asset code, to avoid having to do it with every new update, and I also don’t know if I’m breaking something else. What do you suggest?

      • Ah, you found sensorAlwaysAvailable! Yes, I think you can move ‘instance = this;’ at the start of Awake() in KM. I’ll do the same and check next week, if it eventually breaks anything, but I think it won’t (I’m on a short trip till end of this week). The instance initialization is there for historical reasons. First I only checked if (KM.Instance != null) and later for (kinectManager && kinectManager.IsInitialized()). Please just try and tell here, if it still works. If it does, I’ll change it too, and you will not need to change anything in future updates. Thank you for the feedback!

      • Yeah, that’s the solution I’ve found; thanks for the reply. I’m still having the problem with the singleton instance, but never mind – I checked for null on every line where I used the KinectManager instance. Maybe you can consider my suggestion of putting the “instance = this” line of KinectManager in the Awake() method, so it is consistent with the singleton idea of having a reference to the instance in the scene, and that way the code doesn’t need null-checking everywhere. Thanks again Rumen!
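
        A minimal sketch of the null-safe usage pattern discussed here (the class name is just an example):

        using UnityEngine;

        public class KinectStatusCheck : MonoBehaviour
        {
            void Update()
            {
                KinectManager km = KinectManager.Instance;

                if (km == null)
                    return;  // no KinectManager in the scene (or not yet assigned)

                if (!km.IsInitialized())
                {
                    // the manager exists, but the sensor is unplugged or failed to open - warn the user here
                    return;
                }

                // safe to use the Kinect data from here on
            }
        }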

  28. I still have issues making it work. My character always pulls his legs up instead of crouching… and I don’t get the “muscles” menu. Is there a tutorial somewhere?

    • Sorry, I had a short trip last Friday to Sunday and could not respond earlier. I looked twice at the project you sent me. It doesn’t seem to use any script or component of the K2-asset you’re commenting here. I’m not sure why you ask me for help in this case?!

  29. Hi Rumen,
    Your assets have worked amazingly, and I have used them in many of my projects. Rumen, I have a question regarding the Kinect Fitting-Room Demo 2. I want to use the fitting room with background removal. It would be great if you could help me with the steps to achieve that.

    Thanks
    Aarti

  30. Pingback: Kinect for Windows v2 with Unity VR Examples – Be Analytics

  31. Hi Rumen,

    Loving the asset so far! I have one issue that keeps coming up during builds/compiling regarding shader compilation.

    Shader error in ‘Custom/UserBlendShader’: ps_2_0 does not support indexing resources at line 70 (on d3d9)

    Compiling Fragment program
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING

    and

    Shader error in ‘Custom/UserBlendShader’: ps_4_0_level_9_1 does not support indexing resources at line 70 (on d3d11_9x)

    Compiling Fragment program
    Platform defines: UNITY_NO_LINEAR_COLORSPACE UNITY_ENABLE_REFLECTION_BUFFERS UNITY_PBS_USE_BRDF3

    How should I go about fixing these issues?

    Thanks,
    Justin

    • Hi Justin, hm it looks weird. Which version of Unity do you use? What I see so far is that the error is generated by d3d9. Make sure you have Direct3D-11 and it is the 1st option in ‘Auto graphics API for Windows’-player setting. Otherwise, just delete the shader.

  32. Regarding the crashes when you start the Avatar-demos, Face-tracking-demos or Fitting-room-demos, which are attributed to a known bug in the Kinect face-tracking subsystem: in my tests this only happens when there is something wrong with the system on Windows 10. Maybe you can check whether this really is an issue with the NVidia drivers. I hope this helps.

  33. Hi Rumen. Thanks for your great asset. I have a question: is it possible to detect when the user is smiling? Maybe you have an idea how to create a simple smile detection?

    • Hi, the simplest way would be to get the Happy-face property. I mean something like this:

      KinectInterop.SensorData sensorData = KinectManager.Instance.GetSensorData();
      Kinect2Interface k2int = (Kinect2Interface)sensorData.sensorInterface;

      for (int i = 0; i < sensorData.bodyCount; i++)
      {
          // find the face-frame source that belongs to the primary user
          if (k2int.faceFrameSources != null && k2int.faceFrameSources[i] != null &&
              k2int.faceFrameSources[i].TrackingId == (ulong)primaryUserID)
          {
              if (k2int.faceFrameResults != null && k2int.faceFrameResults[i] != null)
              {
                  // the 'Happy' face property reflects whether the user is smiling
                  Windows.Kinect.DetectionResult smileStatus = k2int.faceFrameResults[i].FaceProperties[Microsoft.Kinect.Face.FaceProperty.Happy];
                  debugText.text = "Smile-status: " + smileStatus;
              }
          }
      }

      • Hi Rumen. I have a question: is it possible to get the user’s face with background removal?
        In my case I need to get the user’s face only and save it as a PNG with transparency.
        Is that possible with your package?

    • Sorry, I forgot to mention you need to uncomment the ‘| FaceFrameFeatures.Happy’-line in InitFaceTracking()-function of KinectScripts/Interfaces/Kinect2Interface.cs. Otherwise the smile status will not be detected.

  34. Hello Rumen,

    Thank you for your awesome assets. I am new to Unity and Kinect. I am a student and I want to develop an interactive game for a school project. I tried to use the avatar from the free Unity asset, but the avatar couldn’t run. I got the message “please check the Kinect SDK installation”. I am using Unity 5 and Kinect v2. Would you help me, please?

    Thank you !

    • Hi, the free Kinect asset works with Kinect-v1 only. For Kinect-v2 you need SDK 2.0 and the K2-asset (see the article above). If you are a student, you’re eligible to get this asset free of charge. Just e-mail me next week from your university e-mail address, and I’ll send you the package.

    • You cannot pause or resume the Kinect. But you can stop or start it. If you have some scenes that require Kinect, and some that don’t, just comment out ‘#define USE_SINGLE_KM_IN_MULTIPLE_SCENES’-line at the start of KinectManager.cs, and then use the KM only in the scenes, where you need the sensor. The other option is to call OnDestroy() and Awake()-methods of KinectManager from your script, in order to stop or start the sensor when needed.
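
      A hedged sketch of the second option (it assumes you make the KinectManager’s Awake() and OnDestroy() methods public, or otherwise reachable, in your copy of KinectScripts/KinectManager.cs):

      using UnityEngine;

      public class KinectStartStop : MonoBehaviour
      {
          public KinectManager kinectManager;  // drag the KinectController's KinectManager component here

          public void StopSensor()
          {
              if (kinectManager != null && kinectManager.IsInitialized())
              {
                  kinectManager.OnDestroy();  // closes the sensor (assumes the method was made public)
              }
          }

          public void StartSensor()
          {
              if (kinectManager != null && !kinectManager.IsInitialized())
              {
                  kinectManager.Awake();      // re-runs the sensor initialization (assumes public)
              }
          }
      }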
