Kinect-v2 VR Examples is a set of virtual reality examples that use the Kinect-v2 sensor (also known as ‘Kinect for Xbox One’). To run them, you need to download the ready-made build of KinectDataServer from the download section below. Alternatively, you can run or build the KinectDataServer scene yourself, if you have the ‘Kinect v2 Examples’ package.
This package contains ten demo scenes. The avatar demos show how to use Kinect-controlled avatars in virtual reality scenes. The gesture demos demonstrate how to use Kinect and visual gestures in VR scenes. The interaction demos show how to use hand interactions in virtual reality, and the speech-recognition demo shows how to use Kinect speech recognition in virtual reality scenes.
All Kinect-related components used by the scenes in this package are lightweight. They don’t use the Kinect sensor directly, but instead receive the sensor data over the network from the Kinect data server. The data server must run on the machine to which the Kinect-v2 sensor is connected. The package supports standalone and mobile builds, as well as VR builds for platforms like Oculus, Gear-VR, Vive, Cardboard, etc. It works in both Unity Pro and Unity Personal editors.
Free for education:
The package is free for academic use (i.e. in schools, colleges and universities, by students, teachers or researchers). If you meet these criteria, please e-mail me to get the K2-VR asset directly from me.
My only request is NOT to share the package or its demo scenes in source form with others without my explicit consent, regardless of whether you purchased it at the Unity Asset Store or got it from me free of charge for academic use. I’m still not sure why I should have to say this explicitly, but obviously it has not been clear so far. Please respect my work.
How to run the examples:
1. Make sure the Kinect data server is running on the machine to which the Kinect-v2 sensor is connected.
2. Download and import this package into a new Unity project.
3. Open ‘Build settings’ and switch to ‘Android’, ‘iOS’ or ‘PC, Mac & Linux Standalone’ platform.
4. Open a demo scene of your choice. The demo scenes are located in the Assets/Kinect2MobileVr/DemoScenes folder.
5. If the server is running on the same WLAN subnet, it should be discovered automatically when the scene runs. You don’t need to change any server-related settings in this case.
6. Alternatively, you can manually set the ‘Server host’ and ‘Server port’ settings of the KinectDataClient component of the KinectController game object, if you know the Kinect data server’s IP address and port.
7. Run the scene in the editor to make sure it works, i.e. that the configured server settings are correct.
8. Build the scene for a mobile or virtual reality platform and test it on a mobile device. Make sure the mobile device is on the same WLAN subnet as the Kinect data server.
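The automatic server discovery in step 5 is typically implemented as a UDP broadcast on the local subnet. The exact wire protocol of KinectDataServer is not documented here, so the port number and handshake message in this Python sketch are hypothetical; it only illustrates the general broadcast-discovery pattern the client relies on:

```python
import socket

# Hypothetical values -- the real discovery port and message are defined
# by the KinectDataServer build, not documented here.
DISCOVERY_PORT = 8889
DISCOVERY_MESSAGE = b"K2-DISCOVER"

def discover_server(timeout=2.0, address="255.255.255.255", port=DISCOVERY_PORT):
    """Broadcast a discovery datagram on the subnet and wait for the
    first server reply. Returns (host, reply_bytes), or None on timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.settimeout(timeout)
        sock.sendto(DISCOVERY_MESSAGE, (address, port))
        try:
            reply, (host, _) = sock.recvfrom(1024)
            return host, reply
        except socket.timeout:
            return None
```

If no server answers within the timeout, the function returns None, which corresponds to the case where you must fall back to setting ‘Server host’ and ‘Server port’ manually (step 6).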
For more information and short descriptions of the available demo scenes, see the Readme file in the Assets/Kinect2MobileVr/_Readme folder of the package.
* The official release of ‘Kinect v2 VR Examples’-package is available at Unity Asset Store.
* For ‘Kinect-v2 VR Examples’ v1.2 and above – here is the KinectDataServer v1.4, built for different versions of Unity:
- KinectDataServer v1.4, built for Unity 5.3
- KinectDataServer v1.4, built for Unity 5.4
- KinectDataServer v1.4, built for Unity 5.5
- KinectDataServer v1.4, built for Unity 5.6
Please download the file for the Unity version you currently use. Then unzip it into a folder on the machine to which the Kinect sensor is connected. When you run it for the first time, allow public network access if the operating system asks. Otherwise the Kinect data client may not be able to connect to the server.
Short Descriptions of the Demo Scenes
1. KinectAvatarDemo1 – This is a first-person avatar demo. Move your arms or legs, and try to look at them, to see how the sensor tracks you.
2. KinectAvatarDemo2 – This is a third-person avatar demo. Again, move your arms and legs, or move or turn a bit left or right, to see your mirrored movements from a third-person perspective.
3. FlyerGestureDemo – Lean left or right to move the flyer horizontally – left or right. Jump or squat to move it vertically.
4. KinectGestureDemo1 – This is the discrete gestures demo. Swipe left, right or up to turn the presentation cube in the respective direction.
5. KinectGestureDemo2 – This is the continuous gestures demo. Use the Wheel gesture to turn the model left or right, or the Zoom-in/Zoom-out gestures to scale it. Lower your hands between gestures to stop the previous gesture.
6. VisualGestureDemo – This is a very basic demo that lets you check how the visual gestures, configured on the server side, get recognized.
7. KinectInteractionDemo1 – Use your left or right hand to control the hand-cursor. Grab an object & drag it around. Open your hand to release it. Try to interact with the UI components, too.
8. KinectInteractionDemo2 – Grab the cube with your left or right hand. Then turn it in all directions, to look at all its sides.
9. SnowflakeShooterDemo – Look at the falling snowflakes. Close your left or right hand to shoot the snowflake you’re looking at. Keep in mind that your shooting hand must be high enough for the sensor to clearly see its state.
10. KinectSpeechRecognition – Clearly say one of the listed commands to control the robot. The grammar is configured on the server side.
* If the demo scenes cannot connect to the Kinect data server, try to manually set the ‘Server host’ and ‘Server port’ settings of the KinectDataClient component of the KinectController game object.
* Make sure the Kinect data server is running on a server machine and that the Kinect-v2 sensor is connected to it.
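When the client cannot connect, it helps to first confirm that the server machine is reachable at all from the mobile device's subnet. The host and port below are placeholders for your own Kinect data server's address, and the sketch assumes the server exposes a TCP endpoint (if it uses UDP only, this particular check does not apply). A minimal Python reachability check might look like this:

```python
import socket

def is_server_reachable(host, port, timeout=2.0):
    """Try to open a TCP connection to the given host and port.
    Returns True if the port accepts connections, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address -- substitute your Kinect data server's
# actual 'Server host' and 'Server port' values):
# is_server_reachable("192.168.0.10", 8888)
```

If this returns False for the host and port you configured, the problem is on the network or firewall level (see the note above about allowing public network access), not in the demo scenes themselves.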
* Many Kinect-related tips, tricks and examples are available here.
* The Online documentation of the K2-asset can be found here.
What’s new in version 1.2:
1. Added the AvatarScaler and HeadMover components and a new first-person avatar demo scene.
2. Added two gesture-demo scenes – for discrete and continuous gestures.
3. Added two interaction-demo scenes. Tailored InteractionManager for VR use.
4. Added VisualGestureManager-component and visual-gesture demo scene.
5. Added SpeechManager-component and speech-recognition demo scene.
6. Removed the multi-scene demo. KinectManager and KinectDataClient can now be used in each scene.
7. Renamed previous (more complex) demo scenes. Rearranged demo-scene folders.
A video worth more than 1000 words:
Here is a video, courtesy of Ricardo Salazar, created with this package, the Kinect data server, a Kinect-v2 sensor and Samsung’s GearVR HMD: