Kinect-v2 VR Examples

‘Kinect-v2 VR Examples’ is a set of Kinect-v2 (also known as ‘Kinect for Xbox One’) virtual reality examples. To run it, you need to download the ready-made build of KinectDataServer from the download section below. Alternatively, you can run or build the KinectDataServer scene yourself, if you have the ‘Kinect v2 Examples’ package.

This package contains ten demo scenes. The avatar demos show how to utilize Kinect-controlled avatars in virtual reality scenes. The gesture demos show how to use Kinect and visual gestures in VR scenes. The interaction demos show how to use hand interactions in virtual reality. The speech-recognition demo shows how to use Kinect speech recognition in virtual reality scenes.

All Kinect-related components used by the scenes in this package are lightweight. They don’t use the Kinect sensor directly; instead, they receive the sensor data over the network from the Kinect data server. The data server must run on a machine to which the Kinect-v2 sensor is connected. The package supports standalone and mobile builds, as well as VR builds for platforms like Oculus, Gear-VR, Vive, Cardboard, etc. It works in both Unity Pro and Unity Personal editors.

This package was deprecated at the Unity Asset Store as of 01.Sep.2017. The reason: its components and functionality are now integrated into the K2-asset (‘Kinect-v2 Examples with MS-SDK’). I prefer not to maintain two projects with similar functionality. If you still need it or its demo scenes, please e-mail me your request, along with the invoice number of the K2-asset you got from the Unity Asset Store. I’ll send it to you free of charge.

Comparison of the Kinect-v2 packages:

Kinect-v2 Examples with MS-SDK (K2-asset):
- Full-featured Kinect-v2 package. Utilizes all features and streams of the Kinect-v2 (and Kinect-v1) sensors.
- Gets all its data from the sensor connected to the same machine.
- Works on Windows standalone (x86, x64) and UWP platforms.

Kinect-v2 VR Examples (K2VR-asset):
- Mobile version of the K2-asset. Utilizes body tracking, hand interactions, gestures and voice commands.
- Gets its data from the KinectDataServer over the network. The KinectDataServer must run on the machine where the sensor is connected.
- Works on virtually any platform with access to the network.

Free for education:
The package is free for academic use (i.e. in schools, colleges and universities, by students, teachers or researchers). If you match this criterion, please e-mail me to get the K2VR-asset directly from me.

One request:
My only request is NOT to share the package or its demo scenes in source form with others without my explicit consent, regardless of whether you purchased it at the Unity Asset Store or got it from me free of charge for academic use. I’m still not sure why I should state this explicitly, but obviously it was not clear so far. Please respect my work.

How to run the examples:
1. Make sure the Kinect data server is running on a server machine, where the Kinect-v2 sensor is connected.
2. Download and import this package into a new Unity project.
3. Open ‘Build settings’ and switch to ‘Android’, ‘iOS’ or ‘PC, Mac & Linux Standalone’ platform.
4. Open a demo scene of your choice. The demo scenes are located in the Assets/Kinect2MobileVr/DemoScenes folder.
5. If the server is running on the same WLAN subnet, it should be discovered automatically when the scene runs. You don’t need to change any server-related settings in this case.
6. Alternatively, you can manually set the ‘Server host’ and ‘Server port’ settings of the KinectDataClient component of the KinectController game object, if you know the Kinect data server’s IP address and port.
7. Run the scene in the editor to make sure it works. This confirms that the configured server settings are correct.
8. Build the scene for a mobile or virtual reality platform and test it on a mobile device. Make sure the mobile device is on the same WLAN subnet as the Kinect data server.
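To illustrate the automatic discovery in step 5, here is a minimal Python sketch of how a client can find a server on the same WLAN subnet via a UDP broadcast probe. The probe payload, reply format and port below are illustrative placeholders – the actual discovery protocol of KinectDataServer is internal to the package:

```python
import socket

def discover_server(port, address="255.255.255.255", timeout=3.0):
    """Broadcast a probe on the local subnet and wait for a reply.

    The probe payload and port are hypothetical examples; the real
    KinectDataServer discovery protocol is internal to the package.
    Returns (host, reply) of the first server that answers, or None.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    try:
        # send the probe to the whole subnet and wait for the first answer
        sock.sendto(b"KINECT_DISCOVER", (address, port))
        data, (host, _) = sock.recvfrom(1024)
        return host, data
    except socket.timeout:
        # no server answered - fall back to the manually configured
        # 'Server host' / 'Server port' settings (step 6)
        return None
    finally:
        sock.close()
```

This is also why discovery only works within one subnet: broadcast datagrams are not routed beyond it, which is exactly when the manual host/port settings from step 6 become necessary.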

For more information and short descriptions of the available demo scenes, see the Readme-file in the Assets/Kinect2MobileVr/_Readme-folder of the package.

* The official release of the ‘Kinect v2 VR Examples’ package is deprecated at the Unity Asset Store. See above for more info.
* For ‘Kinect-v2 VR Examples’ v1.2 and above – here is KinectDataServer v1.4, built for different versions of Unity:

Please download the respective file for the Unity version you currently use. Then unzip it into a folder on the machine where the Kinect sensor is connected. When you run it for the first time, allow public network access if the operating system asks. Otherwise the Kinect data client may not be able to connect to the server.

Short Descriptions of the Demo Scenes
1. KinectAvatarDemo1 – This is a first-person avatar demo. Move your arms or legs, and try to look at them, to see how the sensor tracks you.
2. KinectAvatarDemo2 – This is a third-person avatar demo. Again, move your arms and legs, or move or turn a bit left or right, to see your mirrored movements from a third-person perspective.
3. FlyerGestureDemo – Lean left or right to move the flyer horizontally – left or right. Jump or squat to move it vertically.
4. KinectGestureDemo1 – This is the discrete-gestures demo. Swipe left, swipe right or swipe up to turn the presentation cube in the respective direction.
5. KinectGestureDemo2 – This is the continuous-gestures demo. Use the Wheel gesture to turn the model left or right, or the Zoom-in/Zoom-out gestures to scale the model. Lower your hands between gestures, to stop the previous gesture.
6. VisualGestureDemo – This is a very basic demo that allows you to check how the visual gestures, configured on the server side, get recognized.
7. KinectInteractionDemo1 – Use your left or right hand to control the hand-cursor. Grab an object & drag it around. Open your hand to release it. Try to interact with the UI components, too.
8. KinectInteractionDemo2 – Grab the cube with your left or right hand. Then turn it in all directions, to look at all its sides.
9. SnowflakeShooterDemo – Look at the falling snowflakes. Close your left or right hand to shoot the snowflake you’re looking at. Keep in mind that your shooting hand must be high enough, so the sensor can clearly see its state.
10. KinectSpeechRecognition – Say one of the listed commands clearly, to control the robot. The grammar is configured on the server side.

* If the demo scenes cannot connect to the Kinect data server, try to manually set the ‘Server host’ and ‘Server port’ settings of the KinectDataClient component of the KinectController game object.
* Make sure the Kinect data server is running on a server machine, and that the Kinect-v2 sensor is connected to it.
* Many Kinect-related tips, tricks and examples are available here.
* The Online documentation of the K2-asset can be found here.

What’s new in version 1.2:
1. Added AvatarScaler- and HeadMover-components and new, first-person avatar-demo scene.
2. Added two gesture-demo scenes – for discrete and continuous gestures.
3. Added two interaction-demo scenes. Tailored InteractionManager for VR use.
4. Added VisualGestureManager-component and visual-gesture demo scene.
5. Added SpeechManager-component and speech-recognition demo scene.
6. Removed multi-scene demo. KinectManager and KinectDataClient may be used in each scene now.
7. Renamed previous (more complex) demo scenes. Rearranged demo-scene folders.

Video worth more than 1000 words:
Here is a video, courtesy of Ricardo Salazar, created with this package, the Kinect data server, a Kinect-v2 sensor and Samsung’s GearVR HMD:


142 thoughts on “Kinect-v2 VR Examples”

  1. Hey man, I have a doubt. I want to create a controller similar to Kinect Sports Rivals utilizing the lasso hand state, but unfortunately I can’t get it to work easily – sometimes Kinect loses track of the hand or thumbs! What do you think is the best way to create a shooter with Kinect utilizing hand states, without getting too many false positives?

    • Hi, I think hand grips and releases are quite reliable, when the user stays frontal to the sensor. In the K2-asset I consider both grips and lassos as hand grips. This decreases the number of false positives. You could also define some “window” around the body center, within which to consider the hand states valid. In my experience, most of these false positives come when the hands are too low or too far from the body.

      • Thanks! I get fewer false positives after configuring the Kinect-v2 angle and better illumination; hand tracking is better now. I also use open/lasso and release as the aiming state, and close/grip as shoot, and added a fire delay to better control when to shoot. Now I’m getting far fewer false positives, and I’m starting to get something similar to the Kinect Sports Rivals shooting game – but with RiftCat I can use the head position to create a Kinect-v2 VR shooting game. Thanks again. I will create some demos and send them to you soon by e-mail.

  2. That’s nice. I was wondering how you were able to solve the hair-detection problem in the background-removal module. The sensor seems to have problems scanning through thin hair. Thanks!

    • There are still issues with the hair, I think. But you can minimize them with proper lighting. For instance, place a light in front and above the user(s).

  3. Highly supportive developer ~ Will keep an eye on Rumen in the future should new unity products arise for our needs. Thanks!

    • Do you mean the ‘Kinect with MS-SDK’ asset, or something else? For a point cloud, you should always have both the color data and the depth data.
