KinectExtras with MsSDK is an extension of the Kinect with MS-SDK package. It provides additional examples and demos, based on the Kinect SDK, that are not directly related to transferring human motion onto digital characters. The package consists of a Kinect Interaction demo presenting hand-cursor control as well as hand grips and releases, a speech-recognition example, a face-tracking example and a background-removal demo. It is possible to integrate all Kinect-related features in one project, i.e. to make KinectExtras and the KinectManager (from the ‘Kinect with MS-SDK’-package) work together.
This package is for Kinect v1 only. It supports 32-bit and 64-bit builds and can work in both Unity Pro and Unity Free editors.
Free for Education:
This package is free for academic use (i.e. for schools, universities, students and teachers). If you meet this criterion, please contact me by e-mail to get the KinectExtras asset directly from me.
Unity 5.0 and later versions introduced an issue that, in many cases, causes Kinect tracking to stop after 1-2 minutes with a DeviceNotGenuine error. As a result, the Kinect-enabled game freezes. There is no definitive workaround yet. My advice: if you encounter this issue, please install Unity 4.6 alongside Unity 5 (just in another folder) and use the K1-assets in the Unity 4 environment. See the updates here.
How to Run the Examples:
1. Download and install Kinect SDK 1.8 or Kinect Runtime 1.8 from here.
2. Download and import the KinectExtras-package.
3. Open and run the scene of your choice, as described in the corresponding Readme pdf-file, located in the Assets-folder.
How to integrate KinectExtras with KinectManager:
To integrate the KinectExtras with the KinectManager (from ‘Kinect with MS-SDK’-asset), do the following:
1. Open KinectScripts/KinectWrapper.cs and near its start uncomment: #define USE_KINECT_INTERACTION_OR_FACETRACKING.
2. To integrate Kinect speech recognition with the KinectManager as well, uncomment also: #define USE_SPEECH_RECOGNITION.
3. Open KinectScripts/InteractionWrapper.cs and near its start uncomment: #define USE_KINECT_MANAGER.
4. Open KinectScripts/FacetrackingWrapper.cs and near its start uncomment: #define USE_KINECT_MANAGER. If you plan to use face-tracking in the scene, don’t forget to enable the ‘Compute color map’-setting of KinectManager-component. It is needed for the face-tracking to work.
5. If you have uncommented ‘USE_SPEECH_RECOGNITION’, open KinectScripts/SpeechWrapper.cs and near its start uncomment: #define USE_KINECT_MANAGER.
6. Make sure that both the KinectManager and the Extras’ managers (InteractionManager, FacetrackingManager or SpeechManager) are components of the MainCamera (or another persistent game object).
7. Open KinectScripts/KinectManager.cs and at the start of its Awake()-method add this line: ‘WrapperTools.EnsureKinectWrapperPresence();’, in order to ensure the presence of the needed native libraries. Also, make sure that the Assets/Resources-folder from KinectExtras-package exists in your project.
8. If you use only the KinectManager-component (but not any of the KinectExtras-managers) in the scene, open KinectScripts/KinectManager.cs and at the start of its Update()-method add this line: ‘KinectWrapper.UpdateKinectSensor();’. Normally, the KinectExtras-managers do that for you.
9. Run the scene to check if the integration works properly and without errors.
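Steps 7 and 8 above boil down to two one-line additions inside KinectManager.cs. Here is a minimal sketch of where they go; the method bodies are abbreviated, and only the two added lines come from the steps above:

```csharp
// KinectScripts/KinectManager.cs (abbreviated sketch, not the full class)
using UnityEngine;

public class KinectManager : MonoBehaviour
{
    void Awake()
    {
        // Step 7: ensure the native libraries from KinectExtras are in place
        WrapperTools.EnsureKinectWrapperPresence();

        // ... the rest of the original Awake() code follows here ...
    }

    void Update()
    {
        // Step 8: only needed if NO KinectExtras-manager is present in the
        // scene; otherwise the Extras' managers call this for you
        KinectWrapper.UpdateKinectSensor();

        // ... the rest of the original Update() code follows here ...
    }
}
```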
The official release of ‘KinectExtras with MsSDK’-package is available at Unity Asset Store.
The project’s Git-repository is located here. This repository is private and its access is limited to contributors only.
* If you get DllNotFoundException or other initialization errors, make sure you have installed the Kinect SDK 1.8 or Kinect Runtime 1.8.
* Kinect SDK 1.8 and tools (Windows-only) can be found here.
* The examples were tested with Kinect SDK 1.7 and 1.8.
* Here is a link to the project’s Unity forum: http://forum.unity3d.com/threads/222517-KinectExtras-with-MsSDK
What’s New in Version 1.8:
1. Added support for x32/x64 architecture at runtime.
2. Improved face tracking – fixed native routine responsible for skeleton selection.
3. Updated mouse cursor control to match the game window.
4. Updated speech error handling to display messages instead of error codes.
5. Renamed the Extras-folder to Samples, to match the respective folder of “Kinect with MS-SDK”.
Playmaker Actions for ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’:
And here is “one more thing”: A great Unity-package for designers and developers using Playmaker, created by my good friend Jonathan O’Duffy from HitLab-Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect v1 and a lot of example scenes. The package integrates seamlessly with ‘Kinect with MS-SDK’ and ‘KinectExtras with MsSDK’-assets. I can only recommend it!
164 thoughts on “KinectExtras with MsSDK”
Hi! I use “Kinect with MS-SDK”, but when I build my game as a standalone Windows application, the avatar does not move with my character, although the character in your asset scene works. Both are set up the same way, and I tried the manual. Do I need some additional setting? I need your e-mail. My English is bad, sorry. My e-mail is firstname.lastname@example.org, please e-mail me.
I’m sending you an e-mail now. Please zip your Unity project, put it in Dropbox (or another web service/cloud) and send me a link to it, so I can download it and take a look at what exactly went wrong with your avatar.
I am trying to run the included demo scenes, but get the errors below. I have the SDK installed and followed the instructions in the pdf instruction files included with the download from the Asset Store. Any help would be greatly appreciated.
NuiInitialize Failed – Device is not connected.
KinectManager:Start() (at Assets/KinectScripts/KinectManager.cs:857)
System.Exception: NuiInitialize Failed
at KinectManager.Start () [0x00023] in C:\Users\Skynet06\Documents\Kinect for Meds\Assets\KinectScripts\KinectManager.cs:717
KinectManager:Start() (at Assets/KinectScripts/KinectManager.cs:858)
This happens sometimes when you connect the Kinect sensor after you’ve opened the Unity project. Please restart Unity or reopen the example project and try again. Send me an e-mail if the error persists.
I have the same problem. I opened Unity before I connected the Kinect. I’ve checked the Device Manager and all drivers are installed.
The sensor should be connected and working before you start Unity. Why do you connect it afterwards? Both the Kinect-with-MSSDK and KinectExtras-packages have been tested hundreds of times already.
Hi from Colombia,
I’ve been looking for some examples of speech recognition with Kinect. At the university (Universidad Manuela Beltrán) we are trying to develop a game for children with special needs, and you are the only one (on my path) who has done such examples. If you could give me some hints or advice about that, I would really thank you.
My e-mail: email@example.com / firstname.lastname@example.org
Cesar U Espitia V.
Sorry for my english.
Hi Cesar, if you plan to use Unity for your game, please send me an e-mail and I’ll send you back a free link to the KinectExtras-package. You may then use it in your game.
Hi. How do you use the BoneOrientationsConstraint script?
I am using this script because some users may have mobility constraints, and I have been trying to figure out how to allow normal character motion based on limited human motion.
The goal of this script is to prevent (as far as possible) accidental incorrect movements of some body parts, like the lower part of the leg (below the knee) moving forward, etc. It is turned on by default, but may be switched off by a setting of the KinectManager, which is a component of the MainCamera.
If those people have disabilities, I’d suggest that you fix the respective bones’ rotations in the avatar’s skeleton and then remove these bones from Kinect control. See the bone-parameters of the AvatarController-component of the avatars in the example scene.
Thank you Rumen.
I do have another Question about the Character Rig.
I created a character and rigged it, then put the AvatarController-script on it. The Kinect allows me to control it, but does not allow it to step forward or backward. In other words, depth movement does not work on my character like it does on yours.
Does the rig need to be set up identically to yours for it to work?
Hi, in order to make the avatar move, you need to set the Root-parameter of AvatarController. The parent of this root bone in the skeleton is used to move the avatar. Otherwise, just modify the script to reflect your rig. Hope this helps.
Thanks for making these great assets. They seem really cool and useful.
I am trying to use the face-tracking feature of KinectExtras to add some immersion to the rotation of a first-person camera. It works quite well, but not perfectly! 🙂
The problem is that the rotation around the up-down axis (y in Unity) is mirrored (when I turn my head left, the camera turns right), and the rotation around the front-back axis (z in Unity) is slightly tilted (maybe 15 degrees). Can you please point me in the right direction: (1) where in the scripts can I mirror the rotation around the y-axis, and (2) how can I add or subtract 15 degrees on the z-axis?
To your question:
Open KinectScripts/FacetrackingManager.cs and search for: ‘FacetrackingWrapper.GetHeadRotation(ref vHeadRot)’. Below this line you can adjust the head rotation to your needs.
Thank you for pointing out that face tracking could be used for first-person camera rotations as well! 🙂
Got it working now.
I was actually messing with that exact part of the script before you replied; it just never occurred to me that I could insert a line about the y-rotation as well! So, with your reply and knowing that was the part of the script to tweak, I just inserted ‘vHeadRot.y = -vHeadRot.y;’ and voila! 🙂
Thanks a lot Rumen!
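For reference, the adjustment discussed in this exchange might look as follows in FacetrackingManager.cs, right after the GetHeadRotation()-call. The 15-degree z-offset is just the example value from the question above; tune both lines to your setup:

```csharp
// inside FacetrackingManager.cs, right after the head rotation is retrieved:
FacetrackingWrapper.GetHeadRotation(ref vHeadRot);

vHeadRot.y = -vHeadRot.y;  // mirror the left/right head turn (Unity y-axis)
vHeadRot.z -= 15f;         // compensate the ~15-degree tilt around the z-axis
```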
Hi Henrik, do you know where I can find the rotation angles?
When I import “KinectExtras” into “Kinect with MS-SDK”, an error occurs. The error message is shown below. I can run Unity without the KinectExtras.
at (wrapper managed-to-native) InteractionWrapper:InitKinectInteraction ()
at InteractionManager.StartInteraction () [0x00019] in C:\Users\AndyChung\Documents\FinalProject\Assets\KinectScripts\InteractionManager.cs:257
InteractionManager:StartInteraction() (at Assets/KinectScripts/InteractionManager.cs:269)
InteractionManager:Update() (at Assets/KinectScripts/InteractionManager.cs:330)
First of all, as far as I see, you didn’t import the KinectExtras-package, but just copied its scripts into your Assets/KinectScripts. This is not enough. You need to copy Resources-folder as well, from the Assets-folder of KinectExtras to your Assets-folder. It contains libraries and resources, needed by KinectExtras-scripts, including the missing KinectUnityWrapper.dll, which causes the error in your case.
Moreover, if you want to make KinectExtras and KinectManager work together, pay attention to ‘How to Integrate KinectExtras with the KinectManager’-section in the article above.
Hi, I want to get access to the latest updates on GitHub. My email address is email@example.com. I am having trouble with speech recognition now. Thanks for your help!
The KinectExtras-project is on BitBucket. Unfortunately there are only 5 user seats available, including mine, and all of them are busy at the moment. What’s the trouble with the Kinect speech recognition?
I am developing Kinect speech recognition in Unity3d and wrote a plugin to use the NUI functions. However, I find it not as accurate as the speech recognition sample provided by Microsoft (the Speech Basics-WPF C# sample in the Kinect Toolkit 1.8). So I hope you can tell me your way of encapsulating the NUI functions. Thanks!
I find the Kinect speech recognition example quite accurate, but anyone can have his own opinion. In this case, you can start with the accurate MS example and write your own, more accurate plugin.
If you need the sources of my native wrapper, send me an e-mail.
My name is Jose Ignacio, a last-semester college student. I’m currently developing a project to manipulate objects with Kinect. I’m sending this message to ask if you could pass me your KinectExtras project, to better understand some things.
My email is firstname.lastname@example.org, I await your response.
Ok, I’ll send you the package. Anyway, it’s easier for me to respond if these messages come as e-mails.
Hi, what is the difference between OpenNI and the MS SDK? I saw your other tutorial, but it is based on OpenNI. I am actually making a game for my school project, so it would be great if you have any tips on how to start.
OpenNI supports multiple sensors on different platforms, but is more difficult to configure and use. Moreover, after Apple bought PrimeSense, the OpenNI website and software were shut down. If you’re new to Kinect, I’d recommend that you start with the Kinect SDK. It is easier to use and works out of the box. Unfortunately, it is Kinect-oriented and Windows-only. Hope this info helps for a start.
Thanks for the info about OpenNI. Yes, I am working on Windows 7 x64. So, is it recommended that I start with this “KinectExtras with MsSDK” tutorial to kick-start my project? I am planning to use a 3D character that moves around a scene. Is it possible to combine a 2D background with a 3D character?
Besides, I would also like to use the avatar to grab and drop an object. Can that be done?
This is my e-mail: email@example.com. I would like to have the KinectExtras-package. 🙂 Thanks
Please send me an e-mail, if you want to get the KinectExtras-package. Otherwise I cannot be sure this is really your e-mail address. Thanks.
I am Chinese and my English is not very good. I have a question to ask you. Thank you.
I want to know how to use the Kinect-gesture mouse to click a GUI button. The gesture mouse is a GUITexture, and a GUITexture can’t interact with a GUI button.
I use the KinectExtras unitypackage, and I tried adding the “GuiWindowScript” script to the scene, but when I play it, it doesn’t respond. I also updated the script, but it still doesn’t respond.
Please e-mail me: firstname.lastname@example.org. Thank you.
Hi there, you can enable the ‘Control Mouse Cursor’-option of the KinectManager. The KinectManager is a component of the MainCamera in the example scenes. Then disable the HandCursor game-object, if you like.
Hello, sorry for my bad English, I speak Spanish. I am developing a project and wonder if you could send me your KinectExtras-package; I would greatly appreciate it. My email: email@example.com. Thank you
Hello, I only send the KinectExtras or other Unity packages after I receive an e-mail. This is the way to prove the given e-mail address belongs to you.
I have a question about KinectExtras with MsSDK. Can I make an interface that is controlled by hand? I want to create buttons like the KinectTileButtons in MS SDK 1.8 and handle Click, Enter, Leave and Grip/Release events by hand, like KinectRegion provides. Does this package have this capability?
Hi, you have two options. The first is to control the mouse with your hand grips and releases, and interact with the GUI that way. The second is to implement your own processing of events, depending on the status of each hand during the interaction. Please send me an e-mail, if you want to try the package yourself.
KinectExtras is super great. It makes the controls in our project much easier and more intuitive.
I have a problem with the build, however. When I use the InteractionManager, the device is not recognized when the game is started from a build, and the game quits. It is okay in the editor though. I followed the steps you described in the “Using Kinect across multiple scenes” pdf, with the StartupScene and KinectObject, but unfortunately it doesn’t help. I also tried to start the built game with the editor closed, also with no luck. This problem happens only when the InteractionManager is enabled.
The interesting thing is: when I use “Build and run”, so the game starts right after the build is finished, this problem doesn’t appear and everything works fine.
Do you have an idea, why this happens?
Thanks for your reply.
You can see in the build’s log-file what went wrong. I suppose you forgot to copy the Resources-folder from the KinectExtras-package into your project. The libraries needed for the app to run are copied from there to the root folder of your build. In the editor, the libraries already exist in the root folder of your project. Hope this info helps.
Yeah, that helped :). I had moved those libraries into a subfolder of Resources, so it was my mistake.
Thanks a lot. Keep up the good work ;).
Hello! I just bought KinectExtras in the Unity Asset Store. I wanted Kinect to track body and face at the same time, but the body tracking becomes slow when I try to combine the scripts in the same scene. Is it possible to combine them?
Yes, you can combine them by following the steps in ‘How to Integrate KinectExtras with KinectManager’-section above. Greetings!
Thank you Rumen! It works great!
I just have a question: I’m trying to track eyelids, but it doesn’t seem to work properly (it gives a 0 value most of the time). Any idea why this is happening?
Hi Urban, as far as I remember, the eyelids’ AUs work only with the ‘Kinect for Windows’ sensor. I cannot test it at the moment, but maybe this is causing the problem. Greetings!
Thank you very much Rumen. I’m planning to get a Kinect for Windows in order to have face tracking (with eyelids) working properly, but I saw that Kinect v2 is even cheaper right now.
I’m doing research on my own and don’t want to spend money without reason. Do you have a Unity example of Kinect v2 face tracking? If you have one and can share it, I will decide on Kinect v2.
Answered by e-mail. Generally, the face-tracking is not yet available in the K2-Unity package.
I have a question.
First, I don’t speak English fluently, sorry.
When building with Unity, SpeechManager.grxml is not created on the desktop (but some dll files are created), so I can’t run my program.
Please send an e-mail to me. Thanks.
There must be a SpeechGrammar.grxml grammar-file in Assets/Resources, BUT named SpeechGrammar.grxml.txt. This file is copied at run-time to the root folder of your build as the grammar (grxml) file. Probably you missed this.
Hello Rumen, I purchased KinectExtras with MsSDK yesterday and must say it is awesome. I am using the face-tracking sample and it works wonderfully. I need help with one thing: can I change the color resolution to 1280×960 in the face-tracking sample’s video feed, and how?
Hello Mona, unfortunately the 640×480 resolution is fixed in the native wrapper. You can send me an e-mail, if you’d like to get the wrapper’s sources and modify them according to your needs. Greetings!
I also tried the integration with the KinectManager. When I uncomment the line ‘#define USE_KINECT_INTERACTION_OR_FACETRACKING’, the sample overlay doesn’t work. Can you suggest how to fix it?
I’ll take a look at this issue after my holidays. Please remind me in case I forget. Also, please send me an e-mail, if you want to get the wrapper source. Otherwise I cannot be sure it is your e-mail.
Please, I do not have access to your repository.
You do not have access to this repository.
Use the links at the top to get back.
Please send me this project per e-mail; I’d appreciate the help.
Yes, the repository access is for contributors and donators only. Please send me an e-mail, stating why you are asking for a free asset. Otherwise, the package is available at the Unity Asset Store.
First of all, I would like to say that your free asset at Unity Asset Store is helping me a lot. I’ll probably buy the Kinect Extras for more examples/future projects that I’m planning.
But as I’m very new to Kinect in Unity, I have some simple questions that I’m not being able to solve:
– How do I change the BGImage to show only the silhouette of the person (it currently shows the person itself, clothes, etc.)?
– Is there any way to make this silhouette collide with some particles that I’ll put in the scene?
– And my final question is about the Z axis: if I put some objects in the scene, can I easily change the Z position of the silhouette depending on the depth of the person, so the person can be in front of or behind these objects?
(bad english, sorry)
Well, if you could help me or explain how I can do that, I’ll be very thankful.
Here is my e-mail: firstname.lastname@example.org
The BG-image is a 2D color image containing the user. The pixels displaying the user are opaque (with alpha>0). All the remaining pixels are transparent (alpha=0). To create a silhouette out of this, you’d need additional image processing. You can do it in your own script, or use a 3rd-party library, like OpenCV for instance.
To make the image collide with something in the scene, you need to convert this 2D image to 3D, by using the depth info and creating a 3D mesh out of the information contained in these two images. Then simply put this mesh in the scene and wrap it in a collider. Hope this information helps.
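As a rough sketch of the depth-to-mesh idea: sample the depth map on a coarse grid, build a mesh from the samples and assign it to a MeshCollider. The GetDepthForPixel()-helper below is hypothetical; substitute whatever depth accessor your project exposes:

```csharp
using UnityEngine;

// Builds a coarse grid mesh from depth samples and wraps it in a collider.
// GetDepthForPixel(x, y) is a placeholder for your own depth-map accessor.
[RequireComponent(typeof(MeshFilter), typeof(MeshCollider))]
public class UserDepthMesh : MonoBehaviour
{
    const int GridW = 32, GridH = 24;  // a coarse grid keeps the collider cheap

    void RebuildMesh()
    {
        var vertices = new Vector3[GridW * GridH];
        for (int y = 0; y < GridH; y++)
            for (int x = 0; x < GridW; x++)
            {
                float depth = GetDepthForPixel(x, y);  // meters, hypothetical helper
                vertices[y * GridW + x] = new Vector3(x * 0.1f, y * 0.1f, depth);
            }

        // two triangles per grid cell
        var triangles = new int[(GridW - 1) * (GridH - 1) * 6];
        int t = 0;
        for (int y = 0; y < GridH - 1; y++)
            for (int x = 0; x < GridW - 1; x++)
            {
                int i = y * GridW + x;
                triangles[t++] = i; triangles[t++] = i + GridW; triangles[t++] = i + 1;
                triangles[t++] = i + 1; triangles[t++] = i + GridW; triangles[t++] = i + GridW + 1;
            }

        var mesh = new Mesh { vertices = vertices, triangles = triangles };
        mesh.RecalculateNormals();

        GetComponent<MeshFilter>().mesh = mesh;
        GetComponent<MeshCollider>().sharedMesh = mesh;  // lets particles etc. collide
    }

    float GetDepthForPixel(int x, int y) { return 2f; }  // stub, replace with real data
}
```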
Hi Rumen, it’s me again, Matheus,
I made a few changes to your OverlayScene to test a few things.
I need the “player” to walk in the scene, where there will be some 3D objects he will be able to interact with (just touch, and the touched object will do something).
What I thought might work is to use two objects rendering two different textures: one that shows the background and the player on a Plane or a Cube, and between this object and the camera, another Plane rendering only the player.
Also, for the player to be able to touch these objects, I thought to track the hand joints and “attach” an overlay object that follows them; when it collides with any object, do something (a simple collision test).
Do you think that this could work? If not, do you have any idea how could I do that?
Also, I’m having the following problem:
When I put the texture that shows only the ‘player’ (but not the ‘static background’) on a 3D object (Plane, Cube, etc.), this “static background” becomes black and only the player is rendered. Could you tell me how to change this ‘black’ color to become transparent?
I hope you can help me with that.
Here is my contact information: e-mail: email@example.com / Skype: matheus.fusco
I suggest you change the shader of the plane or cube to ‘Transparent/Diffuse’, in order to show only the player there. Of course, the rest of the pixels in the player-only texture need to have alpha=0.
I couldn’t quite understand what exactly you’re trying to do, but I think you are on the right track. Just use the overlay example and add colliders and/or rigidbodies to the 3D objects in your scene to detect the interaction.
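The shader change mentioned above is a one-liner at runtime, assuming the player-only texture sits on an object with a Renderer (you can also just pick the shader in the material inspector instead):

```csharp
// make pixels with alpha=0 render as transparent instead of black
GetComponent<Renderer>().material.shader = Shader.Find("Transparent/Diffuse");
```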
Hi Rumen F,
I’ve been trying to add motion, voice and face detection to my project. I wonder if you could send me your KinectExtras-package; I would greatly appreciate it. This is my email: “firstname.lastname@example.org”
Thank you so much.
I don’t send e-mails, but only reply to the e-mails I receive. Greetings!
I’m not able to track the joints of two players, even after checking the “Two Players” checkbox of the KinectManager on the MainCamera.
I created a new project, and even so, skeleton tracking of the 2nd player is not working at all. Would you mind helping me with that?
Hi Matheus, have you added the second avatar (the game object containing AvatarController-component) to the Player2Avatars-list parameter of KinectManager?
My mistake. “Detect closest user” was activated, so it was not tracking player 2 ^^
You are already an expert. 😉
Has this been updated for Kinect V2?
KinectExtras for Kinect v2 is part of the “Kinect v2 with MS-SDK“. This package here and “Kinect with MS-SDK” are for Kinect v1 only.
Hi, I’m from Colombia, and I bought your package from the Asset Store 🙂 I work with the SpeechRecognitionDemo, but I want to change the language to Spanish. I downloaded the Spanish language pack from the Microsoft website; then, in the XML, I set the language to “es-ES” and changed the tags to Spanish tags. After that, I changed the language code: no longer 1033 for English, but 1034 for Spanish. And now I get an error… Can you help me with that? I only want to recognize words in Spanish for a university project.
Thank you so much !
Hi, what you do looks correct. What error exactly do you get? Also, I suppose you modified the grxml-file in the root folder of your Unity project (the folder that contains the Assets-folder), didn’t you?
How can I read data for all skeletons? For example, InteractionWrapper.GetLeftHandState() returns data only for the single currently selected skeleton. I know I can switch skeletons by using SetSkeletonTrackingID(id), but where can I get the IDs of all skeletons?
I found it: InteractionWrapper.GetInteractorsCount() and InteractionWrapper.GetSkeletonTrackingID(id).
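Putting the two calls together, a loop over all tracked interactors could look like this (a sketch based on the function names quoted above; the exact return types may differ in your wrapper version):

```csharp
// iterate over all tracked interactors and read each one's left-hand state
int count = InteractionWrapper.GetInteractorsCount();
for (int i = 0; i < count; i++)
{
    var trackingId = InteractionWrapper.GetSkeletonTrackingID(i);
    InteractionWrapper.SetSkeletonTrackingID(trackingId);  // select this skeleton

    var leftHandState = InteractionWrapper.GetLeftHandState();
    Debug.Log("Interactor " + i + ": left hand state = " + leftHandState);
}
```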
Is it possible to detect the Kinect connecting and disconnecting at runtime? Some years ago, using the Kinect C# SDK, I remember subscribing to the KinectSensors.StatusChanged event, but I cannot find any equivalent in your wrapper.
Unfortunately with the current Extras-wrapper this is not possible.
Ok, it’s not really important. How about setting the elevation angle?
Add this to your script (the function is imported from the package’s native KinectUnityWrapper.dll, so the DllImport-attribute is needed):
[DllImport("KinectUnityWrapper")]
private static extern int SetKinectElevationAngle(int sensorAngle);
and then call the imported function to set the angle.
Unfortunately, I’m having some difficulty moving a character with the AvatarController attached. When I move it (via WASD or the Unity scene editor), it moves back to the same point. Do you know of any way I can solve this?
You can add a function to the AvatarController-script that sets or changes the xOffset, yOffset and zOffset variables. Then use it for the WASD movements. Contact me by e-mail or Skype, if you have difficulties coding it.
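Such a function could be as simple as the sketch below. The xOffset/yOffset/zOffset fields are the ones mentioned above; the function name, the speed value and the axis mapping are arbitrary choices for illustration:

```csharp
// added to AvatarController.cs: shift the avatar's base position
public void MoveAvatarOffset(float dx, float dy, float dz)
{
    xOffset += dx;
    yOffset += dy;
    zOffset += dz;
}

// example caller, e.g. in a separate input script on the same game object:
void Update()
{
    float speed = 2f * Time.deltaTime;
    float dx = Input.GetAxis("Horizontal") * speed;  // A/D keys
    float dz = Input.GetAxis("Vertical") * speed;    // W/S keys
    GetComponent<AvatarController>().MoveAvatarOffset(dx, 0f, dz);
}
```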
I like this Unity asset so much!
But I need to use more than one Kinect. Is that possible with KinectExtras with MsSDK, or with another of your assets? Is there a possibility to join Kinects in Unity, to map more people and increase precision?
Hello Marlon, unfortunately this cannot be done with the KinectExtras or any other of my assets. If you’d like to implement it yourself in the native wrapper, please contact me by e-mail and include your invoice number.
Hi Rumen, on Unity Pro 4.6.3 I am getting an error like: Assets/SpeechRecognitionDemo/Scripts/ThirdPersonCamera.cs(4,14): error CS0101: The namespace `global::’ already contains a definition for `ThirdPersonCamera’. What should be done to fix this? Thank you.
Probably there is a second script with the name ThirdPersonCamera. You could rename the script (file name + class name!) in the SpeechRecognitionDemo/Scripts-folder. Then update the reference to it from the MainCamera in the SR-scene.
Thanks Rumen, I have just seen your reply, so sorry for the late thanks 🙂
Hello Rumen, I’m new to Kinect development using Unity, and I have a question about your asset, the speech recognition to be exact: does it support a different language, like Spanish? I’m really interested in using the Kinect MS-SDK asset, but I want to be sure about it. Also, does it work with Unity 5?
(English is not my language, sorry for errors…)
Hi, to use a different language, you need to download and install the respective language pack from here: http://www.microsoft.com/en-us/download/details.aspx?id=34809 (if it is available) and change the language-code setting of the SpeechManager-component. As to Unity 5, it is problematic to use with the Kinect-v1 SDK, so if possible refrain from upgrading for now, or install Unity 4.6 in a separate folder and use it for the K1-project.
I have started expanding on your examples for my own project and right now I am particularly focused on speech recognition.
When structuring a grammar file, it appears you can build in dynamic rules that return dynamic results. I understand the grammar-file structure, but I can’t seem to figure out how to get the properties of the dynamic result. I am using the first example on this page:
The speech recognition works; however, it only returns “meeting” instead of the properties “request:meeting” and “participants:name”.
Do you know how to access these properties? Is this some limit on Kinect with Unity? Is it some limit on the code examples you have made?
I am trying to track down how to take advantage of the dynamic rules inside the grammar file. Any ideas you have would be great!
It seems as though I can only retrieve one from the grammar file. Is there a way to loop through and retrieve all the ‘s?
Hi Zac, sorry for the delay. I think it is a limitation of my implementation, but I have to test it in more detail. And thank you for pointing me to this example! Please e-mail me in some time, to check if I can provide you something to test.
Hi. When I build my app for PC and launch the .exe, I always get this message in the output_log:
Fallback handler could not load library C:/Users/Damel/Documents/New Unity Project 1/exe/test_Data/Mono/.\C:/Users/Damel/Documents/New Unity Project 1/exe/test_Data/Plugins/KinectUnityWrapper.dll
Check if you have included the Resources-folder in your project. All the needed libraries are there.
Hi! How do I do face tracking for multiple users?
Use the public functions of the FacetrackingManager that take a userId as first parameter. You can get these userIds from the KinectManager.
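A minimal sketch of such a per-user query follows. The getter names here are assumptions based on typical versions of these components, so check the actual KinectManager/FacetrackingManager API of your package and substitute accordingly:

```csharp
// sketch: query face data for up to two tracked users (getter names may differ)
KinectManager kinectMgr = KinectManager.Instance;
FacetrackingManager faceMgr = FacetrackingManager.Instance;

if (kinectMgr != null && faceMgr != null)
{
    uint userId1 = kinectMgr.GetPlayer1ID();
    uint userId2 = kinectMgr.GetPlayer2ID();

    if (userId1 != 0)
        Debug.Log("User 1 head position: " + faceMgr.GetHeadPosition(userId1, false));
    if (userId2 != 0)
        Debug.Log("User 2 head position: " + faceMgr.GetHeadPosition(userId2, false));
}
```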
error: Assets/Kinect with MS-SDK Playmaker Actions/Actions/GetKinectJointPosition.cs(34,38): error CS0426: The nested type `SkeletonJoint’ does not exist in the type `KinectWrapper’
Help me, please!
Open the action-script and replace ‘SkeletonJoint’ with ‘NuiSkeletonPositionIndex’.
at (wrapper managed-to-native) KinectWrapper:EnableKinectManager (bool)
at KinectWrapper.NuiInitialize (NuiInitializeFlags dwFlags) [0x00000] in C:\Users\erhan-pc\Documents\New Unity Project 101\Assets\KinectScripts\KinectWrapper.cs:315
at KinectManager.Awake () [0x00002] in C:\Users\erhan-pc\Documents\New Unity Project 101\Assets\KinectScripts\KinectManager.cs:871
I suppose you integrated the KinectExtras with KinectManager, but haven’t used any of the KinectExtras’ managers or demos yet. In this case, open KinectScripts/KinectManager.cs and at the start of its Awake()-method add this line ‘WrapperTools.EnsureKinectWrapperPresence();’. This will ensure the presence of the needed native libraries. Also, make sure that the Assets/Resources-folder from KinectExtras-package exists in your project.
Hello Rumen. How do I create a running gesture?
See here: http://forum.unity3d.com/threads/kinect-with-ms-sdk.218033/page-4#post-2331666
How do I create a new gesture in Unity?
(I want to detect different movements.)
Hi Rumen, my name is Moises. I’m making an interactive Kinect v2 with MS-SDK installation with face tracking for a museum in Mexico City. The idea is that when it detects a person, a label with a random legend tracks the person’s head. I’m not a programmer, but I managed to make it work with one person. The idea, however, is that the labels track multiple (4 or 5) people simultaneously. Could you help or advise me?
Thank you very much in advance.
Hi Moises, look at the 2nd face-tracking demo – the one with the Viking’s hat. You need something like this, as far as I understand, only instead of a hat you would use a text-mesh for a label. You also would need to extend a bit the ModelHatController-script, to make it work with multiple users and then apply it to several “hats” on the scene. If you can’t do it alone, please contact me by e-mail at the beginning of the next week. I don’t work at weekends.
Hi Rumen, I read both the KinectManager and the FacetrackingManager to try to understand how they work, and I also read the ModelHatController, but I’m still lost. I’m using FacetrackingDemo2 as a reference. We would greatly appreciate it if you could help me.
Hi Moises, you don’t need to read these classes, just use them as components and as API sources, if needed. What you need to understand is how the ModelHatController works. Please contact me by e-mail and I’ll send you an updated version of this script that lets you specify the player-index as a setting, too.
Hi Rumen. The KinectExtras with MS-SDK package is no longer available to download. I need it for my final project. How can I get it? Thank you.
Yes, I deprecated it, since all its content is present in the Kinect-v2 package. I still can send it to you, if you need it for educational projects.
I am in a similar position to this poster. I’ve been playing with your Kinect-with-ms-sdk to use Kinect v1 with Unity5. But I’d like to play with the face tracking as well, so I was also looking for the KinectExtras with MS-SDK package. If the Kinect-v2 package includes this though that’s great! Does the Kinect-v2 package work with the original Kinect-v1 sensor hardware (and Kinect SDK 1.8) though? And thank you for your really useful work!
Hi Jim. Yes, the K2-package includes the full KinectExtras-functionality and it works with Kinect v1 sensors, too. Here is a tip regarding this: http://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t21 But, if you specifically need the KinectExtras-asset, just e-mail me and I’ll send it to you.
I am teaching a college course and we are using the Kinect; we only have the first version available. So far it has been great, but I wanted to show speech recognition, and I just can’t find any example using Kinect v1. You say the Kinect-v2 project also works with v1 – does the speech recognition work as well? If so, could you provide this module to us? I can send you full contact information for the institution. Thanks a lot!
Yes, it works. Please send me your request for a free package from your college e-mail address. And see this tip, with regard to Kinect v1: https://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/#t21
How do I add new speech commands? For example: “good”.
Add the tags, words and phrases to the SpeechGrammar.grxml.txt-grammar file in Assets/Resources-folder. Do it in a similar way, as the other recognized words and phrases there. Then delete the SpeechGrammar.grxml in the root-folder of your Unity project. SpeechManager will copy the modified grammar-file from the Resources-folder to the root-folder, when it starts up.
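For instance, a new entry in SpeechGrammar.grxml.txt might look like the fragment below. This is a sketch in SRGS grammar format; the rule `id`, `tag-format` and tag syntax must match whatever the shipped grammar file already uses:

```xml
<grammar version="1.0" xml:lang="en-US" root="rootRule"
         xmlns="http://www.w3.org/2001/06/grammar" tag-format="semantics/1.0">
  <rule id="rootRule" scope="public">
    <one-of>
      <!-- ... existing words and phrases stay here ... -->
      <item>
        good
        <tag>out = "GOOD";</tag>
      </item>
    </one-of>
  </rule>
</grammar>
```

After editing, remember to delete the old SpeechGrammar.grxml from the project root, so SpeechManager copies the modified file over at startup.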
Hello there. How do I take what I say to the Kinect and print it directly to the screen?
For example:
I say: “Hello, how are you?”
Screen (debug text): “Hello, how are you?”
Sorry, dictation is not supported.
The SpeechManager does not work while the KinectManager is running.
Well, I usually test my assets before releasing them. I’ll check this, and I really hope you are not wasting my time.
I sent you an e-mail but did not get a reply, so I’ll leave my question here.
I use Kinect v1 with your MS-SDK package.
I want to use the color map as background and attach my avatar to the real person (like a virtual fitting room).
How can I do that?
Should I use CoordinateMapper-something? How do I import that function and the needed structures into the wrapper?
How do I use it?
Please help me, my friend.
Sorry, my support-time is limited and I respond only to e-mails related to the K2-asset, because its users have paid for it. It’s not so easy to explain how to create a virtual fitting room. Generally, your model needs to match the Kinect joint structure, you need to use the coordinate mapping to put it at the right position over the background, and you need to scale it appropriately to match the real person. I hope this information helps you get the right ideas.
Thanks for your fast reply.
Please tell me (if it’s possible) how to import the MapSkeletonPointToColor method from Kinect10.dll, and how to import the needed structures.
Thanks a lot, man 🙂
You don’t need to. See the GetDepthMapPosForJointPos() and GetColorMapPosForDepthPos() functions in KinectManager.cs.
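A rough usage sketch of those two functions, for overlaying something on a joint in the color image. The exact signatures (and helper calls like GetPlayer1ID and GetRawSkeletonJointPos) are assumptions; verify them in KinectScripts/KinectManager.cs of the asset:

```csharp
using UnityEngine;

// Sketch: find the color-image position of a tracked joint,
// e.g. to place a label or clothing item over it.
public class JointOverlaySketch : MonoBehaviour
{
    public int jointIndex = (int)KinectWrapper.NuiSkeletonPositionIndex.HandRight;

    void Update()
    {
        KinectManager manager = KinectManager.Instance;
        if (manager == null || !manager.IsUserDetected())
            return;

        uint userId = manager.GetPlayer1ID();
        if (!manager.IsJointTracked(userId, jointIndex))
            return;

        // World joint position -> depth-map position -> color-map position.
        Vector3 jointPos = manager.GetRawSkeletonJointPos(userId, jointIndex);
        Vector2 depthPos = manager.GetDepthMapPosForJointPos(jointPos);
        Vector2 colorPos = manager.GetColorMapPosForDepthPos(depthPos);

        // colorPos is in color-camera pixel coordinates; map it to your
        // screen or overlay texture as needed.
    }
}
```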
Dear Rumen, you are a great developer.
I would very much appreciate it if you could point me to good references (or tutorials) to help me learn Kinect development in Unity, even with the Microsoft K2 Unity Pro packages. Are there any good references?
You don’t need a tutorial. See the examples in the SDK Browser that comes free with Kinect SDK 2.0. See also this: http://rfilkov.com/2015/01/25/kinect-v2-tips-tricks-examples/
thanks man. 🙂
First of all – thanks for the post.
Second – does it (or will it) work with Kinect SDK 2.0?
I keep getting the “Device is not connected.” exception.
No, but there is another package that works with both Kinect v2 and Kinect v1: https://www.assetstore.unity3d.com/en/#!/content/18708
My previous comment was posted in the wrong blog post.
Sorry, please ignore it.
And again – thanks for the post!
(It referred to “Kinect with MS-SDK”, not the v2 package.)