Azure Kinect Examples for Unity, v1.1 is a set of Azure Kinect (aka ‘Kinect for Azure’, K4A) examples that use several major scripts, grouped in one folder. The package currently contains about ten demo scenes. Apart from Azure Kinect, the K4A-package supports the “classic” Kinect-v2 (aka Kinect for Xbox One) sensor, as well as Intel RealSense D400-series sensors.
Please note, this package is a work in progress, so issues are possible. The underlying Azure Kinect Sensor SDK is still at an early stage, and the Body Tracking SDK is still in preview. Please expect frequent updates and fixes over time. As always, these updates will be available free of charge to all customers.
The avatar-demo scenes show how to utilize Kinect-controlled avatars in your scenes, the gesture demos – how to use discrete and continuous user gestures in your projects, the overlay demos – how to overlay body parts with virtual objects, the point-cloud demo – how to get a volumetric image of the environment into your scene, etc. Short descriptions of all demo scenes are available in the Readme-pdf file in the package, as well as in the online documentation.
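To make the difference between the two gesture flavors concrete, here is a minimal, hypothetical sketch (not the asset's actual API): a discrete gesture fires once when its motion completes, while a continuous gesture reports a progress value every frame. The function names, thresholds and joint data below are illustrative assumptions only.

```python
def detect_swipe_left(hand_x_positions, min_travel=0.3):
    """Discrete gesture (illustrative): fires once the hand has moved
    at least `min_travel` meters to the left across the sampled frames."""
    return hand_x_positions[0] - hand_x_positions[-1] >= min_travel

def lean_progress(spine_angle_deg, max_angle_deg=30.0):
    """Continuous gesture (illustrative): maps the current lean angle
    to a 0..1 progress value, clamped at both ends."""
    return max(0.0, min(1.0, spine_angle_deg / max_angle_deg))

# Hand moving from x=0.5 m to x=0.1 m -> the discrete swipe completes
print(detect_swipe_left([0.5, 0.4, 0.25, 0.1]))  # True
# Leaning 15 degrees out of a 30-degree range -> 0.5 progress
print(lean_progress(15.0))  # 0.5
```

In the package itself, discrete and continuous gestures are handled by the gesture-related scripts shown in the two gesture demo scenes; this sketch only conveys the general idea.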
This package works with Azure Kinect (aka Kinect for Azure), Kinect-v2 (aka Kinect for Xbox One) and Intel RealSense D415 & D435 sensors. It can be used with all versions of Unity – Free, Plus & Pro.
How to run the demo scenes:
1. (Azure Kinect) Download and install the Azure-Kinect Sensor SDK. The download link is below. Then open ‘Azure Kinect Viewer’ to check whether the sensor works as expected.
2. (Azure Kinect) Follow the instructions on how to download and install the Azure-Kinect Body Tracking SDK and its related components. The link is below. Then open ‘Azure Kinect Body Tracking Viewer’ to check whether the body tracker works as expected.
3. (Kinect-v2) Download and install Kinect for Windows SDK 2.0. The download link is below.
4. (RealSense) Download and install RealSense SDK 2.0. The download link is below.
5. Import this package into a new Unity project.
6. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’, Architecture: ‘x86_64’.
7. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
8. Open and run a demo scene of your choice from a subfolder of the ‘AzureKinectExamples/KinectDemos’-folder. Short descriptions of all demo-scenes are available in the online documentation.
* Azure Kinect Sensor SDK can be found here.
* The instructions how to install the body tracking SDK can be found here.
* Kinect for Windows SDK 2.0 can be found here.
* RealSense SDK 2.0 can be found here.
* Please purchase the package from one of the following online shops. Then all future updates will be available to you free of any charge:
* The K4A-asset may always be purchased and downloaded in my online shop.
* The package should eventually be published to the Unity Asset store as well, but there is no estimated date for that yet.
* The free version of this package is available on GitHub.
* The basic documentation is in the Readme-pdf file, in the package.
* The K4A-asset online documentation is available here.
* The K4A-package tips, tricks and examples will be available soon.
* If the Azure Kinect sensor cannot be started, because the StartCameras()-method fails, please check again point 7 of the ‘How to run the demo scenes‘-section above.
* If you get a ‘Can’t create the body tracker’-error message, please check again point 2 of the ‘How to run the demo scenes‘-section above. If the issue persists, please contact me.
* If the body tracking stops working at runtime, or the Unity editor crashes without notice, please restart Unity and try again. This is due to a known bug in the body tracker, which should be fixed in a later SDK release.
* The body tracking does not work currently with the RealSense D400-series sensors. This may change in a future release.
* If there are compilation errors in the console, or the demo scenes remain in the ‘Waiting for users’-state, make sure you have installed the respective sensor SDKs, as well as the other needed components. Please also check whether the sensor is connected.
What’s New in Version 1.1:
1. Replaced DepthEngine 1.0 with 2.0, to conform to Azure Kinect Sensor SDK 1.2.
2. Added two gesture demo scenes, to demonstrate discrete and continuous gesture tracking.
3. Added multi-scene demo, to show how to use the KinectManager in multi-scene projects.
4. Added ‘Point cloud resolution’-setting to the sensor interface components, to allow depth-to-color and color-to-depth image conversions.
5. Added IMU rotation tracking and FollowSensorTransform-component, to allow sensor pose tracking.
6. Fixed an AvatarController issue that caused the model to freeze when the user ID changed.
7. Multiple other updates, improvements and fixes of bugs reported by the users.
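Regarding the new ‘Point cloud resolution’-setting (item 4 above): generating a point cloud in either depth- or color-camera resolution ultimately comes down to back-projecting each depth pixel through the camera intrinsics. Here is a minimal pinhole-camera sketch of that conversion; the intrinsic values are hypothetical placeholders, not real Azure Kinect calibration data, and the function is not part of the asset's API.

```python
def depth_pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a depth pixel (u, v) with depth in meters to a 3D
    point in the camera frame, using pinhole intrinsics (fx, fy, cx, cy)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical intrinsics for illustration only
fx, fy, cx, cy = 600.0, 600.0, 320.0, 288.0
# A pixel at the principal point back-projects straight ahead
print(depth_pixel_to_point(320, 288, 1.5, fx, fy, cx, cy))  # (0.0, 0.0, 1.5)
```

In the actual SDKs this mapping (including lens distortion, which the sketch ignores) is handled by the calibration and transformation functions, which is why the setting can offer both depth-to-color and color-to-depth conversions.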