
The package contains numerous demo scenes:
* The avatar demos show how to use Kinect-controlled avatars in your scenes.
* The gesture demo shows how to use discrete and continuous gestures in your projects.
* The fitting-room demos show how to overlay or blend the user's body with virtual models.
* The background-removal demo shows how to display user silhouettes on a virtual background.
* The point-cloud demos show how to present the real environment or the users as meshes in your scene.
Short descriptions of all demo scenes are available in the online documentation.
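Most demo scenes obtain their tracking data from the KinectManager component. As a minimal sketch (the method names below follow the K4A-asset's online API documentation; please verify them against your package version), a script could follow the primary user's head like this:

```csharp
using UnityEngine;
// KinectManager and KinectInterop come with the K4A-asset
// (AzureKinectExamples/KinectScripts).

public class HeadFollower : MonoBehaviour
{
    void Update()
    {
        // KinectManager is the central singleton of the asset.
        KinectManager kinectManager = KinectManager.Instance;
        if (kinectManager == null || !kinectManager.IsInitialized())
            return;

        // Get the primary (first detected) user, if any.
        ulong userId = kinectManager.GetPrimaryUserID();
        if (userId == 0)
            return;

        // Read the head-joint position in Kinect space (meters)
        // and move this object to it.
        int headJoint = (int)KinectInterop.JointType.Head;
        if (kinectManager.IsJointTracked(userId, headJoint))
        {
            transform.position = kinectManager.GetJointPosition(userId, headJoint);
        }
    }
}
```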
This package works with Azure Kinect, Femto Bolt and Femto Mega sensors, Kinect-v2 (aka Kinect for Xbox One) and iPhone-Pro LiDAR sensors. It can be used with all versions of Unity – Free, Plus & Pro.
If you need a package with similar functionality, components and demo scenes that works with regular cameras, please look at Computer Vision Examples for Unity.
How to run the demo scenes:
1a. (Azure Kinect) Download and install the latest release of the Azure Kinect Sensor SDK. The download link is below. Then open ‘Azure Kinect Viewer’ to check whether the sensor works as expected.
1b. (Femto Bolt and Mega) Download and unzip the latest releases of the Orbbec Viewer and the Orbbec SDK K4A Wrapper. The download links are below. Then open ‘Orbbec Viewer’ first, and then ‘K4A Viewer’, to check whether the sensor works as expected.
2. (Azure Kinect and Femto Bolt/Mega) Follow the instructions on how to download and install the latest release of the Azure Kinect Body Tracking SDK. It is used by all body-tracking-related scenes, regardless of the camera. The download link is below.
3. (Kinect-v2) Download and install Kinect for Windows SDK 2.0. The download link is below.
4. (iPhone Pro) For integration with the iPhone Pro’s LiDAR sensor, please look at this tip.
5a. Import this package into a new Unity project.
5b. (Femto Bolt and Mega) Please follow the steps in this tip.
6. Open ‘File / Build Settings’ and make sure that ‘Windows’ is the currently active platform, and the architecture is set to ‘Intel 64-bit’.
7. Make sure that ‘Direct3D11’ is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
8. Open and run a demo scene of your choice from a subfolder of the ‘AzureKinectExamples/KinectDemos’-folder.
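If you prefer to double-check steps 6 and 7 from code, the hypothetical editor-only helper below uses Unity's standard UnityEditor API to report the active build target and the first graphics API for Windows (the menu path is an arbitrary choice, not part of the package):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

public static class K4ASettingsCheck
{
    // Hypothetical menu entry; place this script anywhere in the project.
    [MenuItem("Tools/Check K4A Build Settings")]
    public static void Check()
    {
        // Step 6: Windows 64-bit should be the active build target.
        bool isWin64 = EditorUserBuildSettings.activeBuildTarget == BuildTarget.StandaloneWindows64;
        Debug.Log("Active build target is Windows 64-bit: " + isWin64);

        // Step 7: Direct3D11 should be the first graphics API for Windows.
        GraphicsDeviceType[] apis = PlayerSettings.GetGraphicsAPIs(BuildTarget.StandaloneWindows64);
        bool d3d11First = apis.Length > 0 && apis[0] == GraphicsDeviceType.Direct3D11;
        Debug.Log("Direct3D11 is the first graphics API: " + d3d11First);
    }
}
#endif
```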
* The latest Azure Kinect Sensor SDK (v1.4.1) can be found here.
* The latest release of Orbbec Viewer can be found here.
* The latest Orbbec K4A-Wrapper (K4A Viewer) can be found here.
* The latest Azure Kinect Body Tracking SDK (v1.1.2) can be found here.
* Older releases of Azure Kinect Body Tracking SDK can be found here.
* Instructions on how to install the Body Tracking SDK can be found here.
* Kinect for Windows SDK 2.0 can be found here.
* RealSense SDK 2.0 can be found here.
Downloads:
* The K4A-asset can be downloaded from Unity Asset store.
* All updates are (and always will be) free of any charge.
* If you’d like to try the free version of the K4A-asset, you can find it here.
* If you’d like to replace Azure Kinect with Orbbec’s Femto Bolt or Mega sensors, please follow the steps in this tip.
* If you’d like to utilize the LiDAR sensor on your iPhone-Pro or iPad-Pro as a depth sensor, please look at this tip.
* (Deprecated) Body-tracking support for the Intel RealSense sensor is deprecated.
One request:
Please don’t share this package or its demo scenes in source form with others, or as part of public repositories, without my explicit consent.
Documentation:
* The basic documentation is in the Readme-pdf file, in the package.
* The K4A-asset online documentation is available here.
* Many K4A-package tips, tricks and examples are available here.
Troubleshooting:
* If the scene starts too slowly and your graphics card is an NVidia RTX 50x0 or newer, please set the ‘Body tracking processing mode’-setting of the Kinect4AzureInterface-component in the scene to DirectML. This happens because the Body Tracking SDK uses an older version of the CUDA library, which may not work well with newer graphics cards. See this comment and this comment.
* If the ‘Body tracking processing mode’-setting of the Kinect4AzureInterface-component is set to DirectML and your scene crashes occasionally, please close the Unity Editor, find ‘directml.dll’ in ‘AzureKinectExamples/SDK/Kinect4AzureSDK/Plugins’-folder and delete it. Then restart the Unity Editor and run the scene again. It should use ‘directml.dll’ from your machine’s system folder.
* If you’re calibrating two or more synchronized cameras in the MultiCameraSetup-scene and you’re experiencing freezes or crashes, please disable the ‘Sync multi-cam frames’-setting on the KinectManager-component. Then try again.
* If the Femto camera updates the user pose only once and then stops, please open the camera in ‘k4aviewer’ (part of Orbbec’s K4A-wrapper) and check whether the timestamps of the camera streams are still rolling. If they have stopped, please repeat step 3b of this tip. Then run ‘k4aviewer’ and the Unity project again, to check whether the camera works without issues.
* If the Unity editor freezes or crashes at scene start, please make sure the path where the Unity project resides does not contain any non-English characters.
* If you get errors like ‘Texture2D’ does not contain a definition for ‘LoadImage’ or ‘Texture2D’ does not contain a definition for ‘EncodeToJPG’, please open the Package Manager, select ‘Built-in packages’ and enable ‘Image conversion’ and ‘Physics 2D’ packages.
* If you get errors like ‘Can’t create body tracker for Kinect4AzureInterface0!’, please follow these tips:
- Check whether you have installed the Body Tracking SDK v1.1.2 in the ‘C:\Program Files\Azure Kinect Body Tracking SDK’-folder.
- Start the ‘Azure Kinect Body Tracking Viewer’ and check whether it works as expected.
- Please note that the ‘Azure Kinect Body Tracking Viewer’ uses the DirectML processing mode by default, while the K4A-asset uses the CUDA processing mode by default (for performance reasons).
- If you have an NVidia GPU and prefer to stay with the CUDA processing mode for body tracking, please make sure you have installed the latest NVidia driver. See this link for more information. To make sure the CUDA processing mode works on your machine, please open the command prompt (cmd), type ‘cd C:\Program Files\Azure Kinect Body Tracking SDK\tools’, and then run ‘k4abt_simple_3d_viewer CUDA’. This will start the ‘Azure Kinect Body Tracking Viewer’ in CUDA processing mode.
- Otherwise, if CUDA doesn’t work or you prefer to use the DirectML processing mode (as the ‘Azure Kinect Body Tracking Viewer’ does) in the K4A-asset too, please open the ‘Kinect4AzureInterface.cs’-script in the ‘AzureKinectExamples/KinectScripts/Interfaces’-folder, find this line: ‘public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode = k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_CUDA;’, and replace ‘K4ABT_TRACKER_PROCESSING_MODE_GPU_CUDA’ with ‘K4ABT_TRACKER_PROCESSING_MODE_GPU_DIRECTML’ in its default value. Then save the script, return to Unity and run the demo scene again.
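For reference, the change described above only swaps the initializer of one field in ‘Kinect4AzureInterface.cs’ (abbreviated sketch; the field should stay public, so that it remains visible in the Inspector):

```csharp
// In AzureKinectExamples/KinectScripts/Interfaces/Kinect4AzureInterface.cs:

// Before – the package default, CUDA processing mode:
// public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode =
//     k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_CUDA;

// After – DirectML processing mode:
public k4abt_tracker_processing_mode_t bodyTrackingProcessingMode =
    k4abt_tracker_processing_mode_t.K4ABT_TRACKER_PROCESSING_MODE_GPU_DIRECTML;
```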
* If the scene works in the Unity editor, but doesn’t work in the build, please check whether the ‘Architecture’ in the Build settings is ‘x86_64’, and the ‘Scripting backend’ in the Player settings is set to ‘Mono’.
* If you can’t upgrade the K4A-package in your project to the latest release, please go to ‘C:/Users/{user-name}/AppData/Roaming/Unity/Asset Store-5.x’ on Windows or ‘/Users/{user-name}/Library/Unity/Asset Store-5.x’ on Mac, find and delete the currently downloaded package, and then try again to download and import it.
* If you get compiler errors in the console like “The type or namespace name ‘UI’ does not exist…”, please open the Package Manager (menu ‘Window / Package Manager’) and install the ‘Unity UI’ package. The UI elements are used extensively in the K4A-asset demo scenes.
* If you get “‘KinectInterop.DepthSensorPlatform’ does not contain a definition for ‘DummyK2’” in the console, please delete ‘DummyK2Interface.cs’ from the ‘KinectScripts/Interfaces’-folder. This dummy interface has been replaced by ‘DummyK4AInterface.cs’.
* If the Azure Kinect sensor cannot be started, because the StartCameras()-method fails, please check #6 in the ‘How to run the demo scenes’-section above again.
* If you get a ‘Can’t create the body tracker’-error message, please check again #2 in ‘How to run the demo scenes‘-section above. Check also, if the Body Tracking SDK is installed in ‘C:\Program Files\Azure Kinect Body Tracking SDK’-folder.
* If the body tracking stops working at run-time or the Unity editor crashes without notice, update to the latest version of the Body tracking SDK. This is a known bug in BT SDK v0.9.0.
* RealSense support is deprecated. The Cubemos skeleton tracking SDK is not available anymore. For more information please look at this tip.
* If there are errors like ‘Shader error in [System 1]…’ while importing the K4A-asset, please note that these are not real errors, but shader issues due to the missing HDRP & VFX packages. You only need these packages for the point-cloud demos. All other scenes should start without any issues.
* If there are compilation errors in the console, or the demo scenes remain in the ‘Waiting for users’-state, make sure you have installed the respective sensor SDKs and the other needed components. Please also check whether the sensor is connected.
What’s New in Version 1.21:
1. Upgraded the ‘Azure Kinect and Femto Bolt Examples’ package to Unity 6.0.
2. Fixed the incorrect-face-rectangle issue in the ShowFaceImage-component (KinectFaceDemo2).
3. Fixed the offset-node-following issue in the AvatarController-component.
4. Cosmetic changes in various components and demo scenes.
Videos worth more than 1000 words:
Frontier (composed by Yang Bang-eon), Metaverse Orchestra version. Meet this hybrid metaverse performance at the on-site concert, created by Voces Choir:
Bushnell Center for the Performing Arts, created by Patrick Belanger:
Here is a holographic setup created by i-mmersive GmbH, with Unity 2019.1f2, an Azure Kinect sensor and “Azure Kinect Examples for Unity”:
Last but not least, I love sharing creator’s work, like this one by James Bragg.
