Azure Kinect Examples for Unity

Azure Kinect Examples for Unity, v1.16.2 is a set of Azure Kinect (aka ‘Kinect for Azure’, K4A) examples that use several major scripts, grouped in one folder. The package currently contains over thirty-five demo scenes. Apart from the Azure Kinect sensor (aka K4A), the K4A-package also supports the “classic” Kinect-v2 (aka Kinect for Xbox One) sensor, as well as Intel RealSense D400-series sensors.

The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demo – how to use discrete and continuous gestures in your projects, the fitting-room demos – how to overlay or blend the user’s body with virtual models, the background-removal demo – how to display user silhouettes on a virtual background, the point-cloud demos – how to show the environment or users as meshes in your scene, etc. Short descriptions of all demo scenes are available in the online documentation.

This package works with Azure Kinect (aka Kinect for Azure, K4A), Kinect-v2 (aka Kinect for Xbox One) and Intel RealSense D400-series sensors. It can be used with all versions of Unity – Free, Plus & Pro.

How to run the demo scenes:
1. (Azure Kinect) Download and install the latest release of the Azure Kinect Sensor SDK. The download link is below. Then open ‘Azure Kinect Viewer’ to check if the sensor works as expected.
2. (Azure Kinect) Follow the instructions on how to download and install the latest release of the Azure Kinect Body Tracking SDK and its related components. The link is below. Then open ‘Azure Kinect Body Tracking Viewer’ to check if the body tracker works as expected.
3. (Kinect-v2) Download and install Kinect for Windows SDK 2.0. The download link is below.
4. (RealSense) Download and install RealSense SDK 2.0. The download link is below.
5. Import this package into a new Unity project.
6. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’ & Architecture: ‘x86_64’, and in ‘Player settings’ make sure the ‘Scripting backend’ is set to ‘Mono’.
7. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
8. Open and run a demo scene of your choice from a subfolder of the ‘AzureKinectExamples/KinectDemos’-folder. Short descriptions of all demo-scenes are available in the online documentation.

* The latest Azure Kinect Sensor SDK (v1.4.1) can be found here.
* The latest Azure Kinect Body Tracking SDK (v1.1.0) can be found here.
* Older releases of Azure Kinect Body Tracking SDK can be found here.
* Instructions on how to install the Body Tracking SDK can be found here.

* Kinect for Windows SDK 2.0 can be found here.
* RealSense SDK 2.0 can be found here.

* The K4A-asset may be purchased and downloaded in the Unity Asset Store. All updates are, and will remain, available to all customers free of charge.
* If you’d like to try the free version of the K4A-asset, you can find it here.
* If you’d like to utilize the Intel RealSense sensor with Cubemos body tracking, please look at this tip.
* If you’d like to utilize the LiDAR sensor on your iPhone-Pro or iPad-Pro as a depth sensor, please look at this tip.

Free for education:
The package is free for academic use. If you are a student, lecturer or university researcher, please e-mail me to get a free copy of the K4A-asset directly from me.

One request:
Please don’t share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.

* The basic documentation is in the Readme-pdf file, in the package.
* The K4A-asset online documentation is available here and as pdf.
* Many K4A-package tips, tricks and examples are available here.

* If you get errors like ‘Can’t create body tracker for Kinect4AzureInterface0!’, please follow these tips:

  • Check if you have installed the Body Tracking SDK v1.1.0 in the ‘C:\Program Files\Azure Kinect Body Tracking SDK’-folder.
  • Start the ‘Azure Kinect Body Tracking Viewer’, and check if it works as expected.
  • Please note that the ‘Azure Kinect Body Tracking Viewer’ by default uses the DirectML processing mode, while the K4A-asset by default uses the CUDA processing mode (for performance reasons).
  • If you have an NVidia GPU and prefer to stay with the CUDA processing mode for body tracking, please make sure you have installed the latest NVidia driver. See this link for more information. To make sure the CUDA processing mode works on your machine, please open the command prompt (cmd), type ‘cd C:\Program Files\Azure Kinect Body Tracking SDK\tools’, and then run ‘k4abt_simple_3d_viewer CUDA’. This will start the ‘Azure Kinect Body Tracking Viewer’ in CUDA processing mode.
  • Otherwise, if CUDA doesn’t work or you prefer to use DirectML (as ‘Azure Kinect Body Tracking Viewer’ does) in the K4A-asset too, please open the ‘Kinect4AzureInterface.cs’-script in the ‘AzureKinectExamples/KinectScripts/Interfaces’-folder, find this line: ‘public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode = k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_CUDA;’, and replace it with ‘public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode = k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_DIRECTML;’. Then save the script, return to Unity and try to run the demo scene again.
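  For clarity, here is a minimal sketch of the edited declaration in ‘Kinect4AzureInterface.cs’ – only the enum value at the end of the line changes, while the rest of the declaration stays as it is:

```csharp
// in AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs
// switch the default body-tracking processing mode from CUDA to DirectML
public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode =
    k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_DIRECTML;
```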

* If the scene works in Unity editor, but doesn’t work in the build, please check if the ‘Architecture’ in Build settings is ‘x86_64’, and the ‘Scripting backend’ in Player settings is set to ‘Mono’.
* If you can’t upgrade the K4A-package in your project to the latest release, please go to ‘C:/Users/{user-name}/AppData/Roaming/Unity/Asset Store-5.x’ on Windows or ‘/Users/{user-name}/Library/Unity/Asset Store-5.x’ on Mac, find and delete the currently downloaded package, and then try again to download and import it.
* If Unity editor freezes or crashes at the scene start, please make sure the path where the Unity project resides does not contain any non-English characters.
* If you get syntax errors in the console like “The type or namespace name ‘UI’ does not exist…”, please open the Package manager (menu Window / Package Manager) and install the ‘Unity UI’ package. The UI elements are extensively used in the K4A-asset demo scenes.
* If you get “‘KinectInterop.DepthSensorPlatform’ does not contain a definition for ‘DummyK2’” in the console, please delete ‘DummyK2Interface.cs’ from the KinectScripts/Interfaces-folder. This dummy interface has now been replaced by DummyK4AInterface.cs.
* If the Azure Kinect sensor cannot be started because the StartCameras()-method fails, please check again #6 in the ‘How to run the demo scenes’-section above.
* If you get a ‘Can’t create the body tracker’-error message, please check again #2 in the ‘How to run the demo scenes’-section above. Check also whether the Body Tracking SDK is installed in the ‘C:\Program Files\Azure Kinect Body Tracking SDK’-folder.
* If the body tracking stops working at run-time or the Unity editor crashes without notice, update to the latest version of the Body tracking SDK. This is a known bug in BT SDK v0.9.0.
* Regarding RealSense: If you’d like to try its integration with Cubemos body tracking SDK, please look at this tip.
* If there are errors like ‘Shader error in [System 1]…’ while importing the K4A-asset, please note these are not really errors, but shader issues due to the missing HDRP & VFX packages. You only need these packages for the point-cloud demos. All other scenes should start without any issues.
* If there are compilation errors in the console, or the demo scenes remain in the ‘Waiting for users’-state, make sure you have installed the respective sensor SDKs and the other needed components. Please also check if the sensor is connected.

What’s New in Version 1.16.x:
1. Added support for Azure Kinect Body Tracking SDK v1.1.0 (big thanks to 葛西浩!).
2. Added optional bone colliders to the AvatarController-component.
3. Updated BackgroundUserBodyImage- & BackgroundColorCamUserImage-components to support individual user indexes (thanks to Tepat Huleux).
4. Fixed SceneMeshDemo and BackgroundRemovalDemo2-scenes to limit the space in camera coordinates (thanks to Tomas Durkin).
5. Fixed the dst/cst transfer issue in case of KinectNetServer running with ARKit sensor interface.
6. (1.16.1) Added ‘Body tracking model type’-setting to the Kinect4AzureInterface-component, to allow switching between the full or lite body tracking models.
7. (1.16.1) Added KinectStatusChecker-component (thanks to Jess Deacon), and JointAngleCalculator-sample (thanks to Frank Fedel).
8. (1.16.1) Fixed body-merge issue in case of multiple RealSense sensors (thanks to Jorge Pelaez).
9. (1.16.2) Added ‘Fixed step indices’-user detection order (thanks to Jasper Cook).
10. (1.16.2) Improved SwipeUp-gesture detection (thanks to Andy Cockayne).
11. (1.16.2) Fixed calculation of color & depth overlay positions in case of multiple sensors (thanks to Casey Farina).

Videos worth more than 1000 words:
Here is a holographic setup, created by i-mmersive GmbH, with Unity 2019.1f2, Azure Kinect sensor and “Azure Kinect Examples for Unity”:


145 thoughts on “Azure Kinect Examples for Unity”

  1. Hi Rumen, Firstly, I’d like to say that this package is a great tool.
    I’ve been working on a Unity project using Azure Kinect Examples for Unity for a couple of months. It was fine all the time.
    Windows force-updated my system yesterday. After that update finished, my project can’t run normally. I tried some of the examples of Azure Kinect Examples for Unity: running the BackgroundRemovalDemo1 scene showed a small window with the title “Unity Editor – Unity 2019.3.3f1_7…” and terminated immediately, while running BackgroundRemovalDemo2 everything was fine. I found that if I set the ‘Get body frames’-parameter of the KinectManager component to ‘None’, the scene runs fine; otherwise, it fails to run.
    How do I solve this problem?

    • Hi,

      Obviously there is a problem with the Body Tracking SDK. Please run some of the problematic scenes in the Editor (I suppose the 1st avatar demo will be problematic, as well), then close Unity editor, and e-mail me the Editor’s log file, so I can take a closer look. Here you can see where to find Unity log-files:

  2. Hi Rumen,
    The color camera fps doesn’t look like 30 fps.
    But it works perfectly in Azure Kinect Viewer.
    How can I fix it?

    • Hi, I’m not sure what you mean by “fps doesn’t look like 30fps”. Please e-mail me some video clip that shows the issue, the current settings of the KinectManager-component in the scene, and the version of the K4A-asset you are using.

  3. Hi Rumen!

    I have a problem with a sensor data parameter in the Kinect manager: “Get Pose Frames”. By default it is set to “None” in your plug-in, but I need to set it to “Raw Pose Data” in my project.

    When I use the Kinect4Azure sensor with this parameter set to something other than “None”, both body tracking and application performance drop considerably (it works at 8 fps).

    Do you know why this happens with the Azure Kinect? With Kinect-v2 this problem does not occur. My project must support either of the two sensors.


    OS: Windows 10
    K4A-asset: v1.16

    • Hi. Yes, you are right. This setting (when not ‘None’) tries to estimate the sensor’s pose, i.e. its position and rotation. In case of Azure Kinect it determines the height of the sensor above the ground and the sensor rotation around the X & Z axes. This involves some complex computations on the GPU & CPU side, and that causes the performance issues. But usually the Azure Kinect is static and its pose needs to be determined only once – after the sensor gets installed. In this regard, please see this tip, as well as the next one, if you intend to use multiple sensors in your scenes.

      • Thanks! I ended up using the ‘EnablePoseStream(true/false)’ flag to enable and disable the IMU frames for a few seconds, and thus got the position and rotation of the sensor without problems.
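      The approach described in this thread can be sketched roughly as follows. This is only a sketch: the ‘EnablePoseStream’ name and signature follow the comment above and may differ in your K4A-asset version, and the 3-second settling time is an assumption.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical helper: enables the pose (IMU) stream briefly at startup,
// lets the sensor-pose estimate settle, then disables the stream again
// to avoid the ongoing performance hit described above.
public class OneShotPoseEstimator : MonoBehaviour
{
    IEnumerator Start()
    {
        // wait until the KinectManager is up and running
        while (KinectManager.Instance == null || !KinectManager.Instance.IsInitialized())
            yield return null;

        KinectManager.Instance.EnablePoseStream(true);   // start pose estimation
        yield return new WaitForSeconds(3f);             // let the estimate settle (assumed duration)
        KinectManager.Instance.EnablePoseStream(false);  // stop it; the estimated sensor pose remains in use
    }
}
```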

  4. Hi, I am looking for a help for the following scenario :

    My scenario is: I need to capture a color image with the Azure Kinect DK, and that image has to be saved in PNG or JPG format on my PC using Unity. Can you please share a few references for implementing this scenario?

    • Hi, this should be pretty straightforward. You would need something like this in your script:

      // get the latest color camera frame as a Texture2D
      Texture2D texColorImage = (Texture2D)KinectManager.Instance.GetColorImageTex(0);
      // encode the texture to PNG and save it to the working directory
      byte[] btColorImage = texColorImage.EncodeToPNG();
      System.IO.File.WriteAllBytes("ColorImage.png", btColorImage);

      Please don’t forget to set ‘Get color frames’-setting of the KinectManager-component in the scene to ‘Color texture’, too.
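      To capture the image on demand rather than in a one-off script, the snippet above could be wrapped in a simple component like the sketch below. The component name and the key binding are illustrative, not part of the K4A-asset:

```csharp
using UnityEngine;

// Hypothetical helper: saves the current color frame as a PNG when the user
// presses S. Attach it to any object in a scene whose KinectManager has
// 'Get color frames' set to 'Color texture'.
public class ColorImageSaver : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.S) &&
            KinectManager.Instance != null && KinectManager.Instance.IsInitialized())
        {
            Texture2D texColorImage = (Texture2D)KinectManager.Instance.GetColorImageTex(0);
            byte[] btColorImage = texColorImage.EncodeToPNG();
            // the file path is relative to the project/build working directory
            System.IO.File.WriteAllBytes("ColorImage.png", btColorImage);
        }
    }
}
```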
