Hi, my name is Rumen Filkov, or just Rumen. I’m an electrical engineer, MSc in computer science, stubborn entrepreneur and life-long learner. Currently I work as CTO at Quanterall (a company that constantly builds innovative, scalable, and high-performance end-to-end customized software solutions for its clients), as well as on my own entrepreneurial R&D projects.
A big part of my research efforts, as well as of my private R&D projects, is in the following fields:
- Natural user interface (NUI) devices like Microsoft Kinect (Azure Kinect, Kinect-v2, Kinect-v1), RealSense D400 series, as well as augmented reality (AR) devices like HoloLens 1 & 2.
- Virtual and augmented reality (VR/AR) technologies, powered by the Unity3D game engine.
- Image analysis and classification algorithms in the areas of depth estimation, body tracking, face detection, emotion recognition, background segmentation, object detection and the like.
- Regression and classification algorithms for user experience analysis, i.e. machine learning (ML) algorithms for analyzing the health status of elderly people, the results of medical treatments, and the like.
- Democratization of trade and finance.
I know these areas are too many for a single person, but I’m a stubborn Capricorn and don’t give up easily.
- E-mail: rumen.filkov(at)gmail.com
- Twitter: https://twitter.com/roumenf
- Facebook: https://www.facebook.com/rfilkov/
- LinkedIn: https://www.linkedin.com/in/rfilkov/
Important info regarding e-mail support requests (as of 20.March.2022):
- Please don’t contact me on weekends (Friday to Sunday) or during holidays. Please also don’t contact me unless you are really stuck. I’m quite busy at work and need some rest on weekends.
- The e-mail contact above is reserved mainly for customers of my commercial Unity assets. Students and academic staff may use it too, to request free academic copies of the assets.
- The customers of my commercial Unity assets are eligible for up to 3 free e-mail support requests. Please mention your invoice number in the e-mail.
- If you need additional support (after the free support requests are used up), please purchase an additional support request here and mention the invoice number in the e-mail.
- The customers of my free Unity assets and the users of free academic and personal copies are not eligible for e-mail support. Please research and experiment a bit more by yourselves.
37 thoughts on “About”
Hi Rumen, Can you check out your blog on Sunflower Oil treatment to answer a few questions? Thanks, Susan
Hi Susan, sorry for the delay. I’m on vacation right now and could not respond immediately. Best, Rumen.
Hello Rumen, great job! I was wondering, before buying it, whether this product needs Unity 3D Pro to run properly. Thanks!
Nope, and it never did. This is from the package description: “This package can be used in both Unity Pro and Unity Personal editors.”
Many thanks for the Kinect with MS-SDK package!
That being said, is there any chance that the KinectExtras with MsSDK package will come back to the asset store? If not, is there any other way to get it?
Thanks in advance!
I’ll think about it. It was fully integrated into the K2-asset, which works with Kinect v1 too. That’s why I deprecated it, in order not to support two equivalent packages. If you need it for K1 sensors only, you may send me an e-mail request to get it from me.
Thanks for the swift response. In regards to both the KinectExtras and K2-asset, does any of them have basic functionality for audio capture and/or audio detection?
Thank you in advance for taking the time to answer this.
Both assets provide grammar-based speech recognition. Otherwise, the audio API is available in the K2-asset, as provided by the MS plugin.
Hello. Thank you very much for the Kinect with MS-SDK asset. Did you think about the Run gesture? Is it possible to add? I can make some payment for this.
I posted source code for Run-gesture somewhere in the forum some time ago. Please look it up.
Are there any instructional videos? I can’t figure it out without purchasing your assets.
I don’t make any videos myself. Sorry, but I have neither the time nor the artistic skills to do it.
Instead, I provide many tips & tricks here and asset documentation here.
I got the following error, please help:
error CS1061: Type `System.IO.FileInfo' does not contain a definition for `Length' and no extension method `Length' of type `System.IO.FileInfo' could be found (are you missing a using directive or an assembly reference?)
As I said, see https://rfilkov.com/2014/08/01/kinect-v2-with-ms-sdk/#ki
And please don’t post on several channels, and research a bit more first.
Hello. Thank you very much for the Kinect V2 with MS-SDK asset.
I have a problem with the asset. The Kinect v2 automatically opens and closes, over and over, when I use a KinectController object with KinectManager, BackgroundRemovalScript and InteractionManager attached all together.
I tried other Kinect v2 sensors and other computers, but the problem still happens.
Please tell me how to fix it. I want the Kinect v2 sensor to open once, when the Unity executable launches.
Hi, the sensor is controlled by the KinectManager. You need to have only one KinectManager in your game. Please look at the multi-scene demo and the respective pdf in _Readme-folder, to find out how to setup the KinectManager in a multi-scene environment.
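For reference, the usual Unity pattern for keeping a single manager component alive across scene loads looks roughly like this. The class below is illustrative only, not the asset’s actual KinectManager source:

```csharp
using UnityEngine;

// Illustrative sketch of the common Unity "single manager" pattern:
// exactly one instance survives scene changes, so a sensor it controls
// is opened once at startup and not re-opened on every scene load.
public class SingleManagerSketch : MonoBehaviour
{
    private static SingleManagerSketch instance;

    void Awake()
    {
        if (instance != null && instance != this)
        {
            // A second copy was loaded with a new scene - drop it,
            // so the original manager (and its open sensor) stays in charge.
            Destroy(gameObject);
            return;
        }

        instance = this;
        DontDestroyOnLoad(gameObject);  // survive subsequent scene loads
    }
}
```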
Thank you for giving a comment, Mr. Rumen F.
The app I develop is just one scene. It’s not a multi-scene environment.
I tested your asset and found out that if there is a person (or people) in front of it, the Kinect v2 sensor doesn’t close automatically. The sensor opens once.
The Kinect v2 sensor closes automatically when there is no person. Is this by design in the asset?
Well, I’m a bit surprised. This asset has thousands of users and nobody has complained so far that the sensor closes automatically when there are no people in front of it. Are you testing the unaltered demo scenes, or your own scene? Also, the full KM source is there, so you can put some breakpoints in Awake() and OnDestroy() to check when (and why) they get called.
I purchased Kinect v2 Examples in the Asset Store.
I’m sorry, I sent the wrong question before.
The exact question is:
What do I modify in the KinectHolographicViewer scene, if I want the scene to move toward my position when I move toward the Kinect?
It works well horizontally and vertically, but in depth the scene moves in the opposite direction by default.
The scene gets farther when I move toward the Kinect.
I want the scene to get nearer when I move toward the Kinect.
Thanks a lot~
Yes, I think you are right. Please e-mail me and I’ll send you an updated holographic camera script.
Hi, I purchased the Kinect MoCap Animator and am using it with only the Leap Motion. When I record, everything seems to work well, the screen updates saying it is recording and also stopped. But when I attempt to locate the mocap file, it does not exist. I am not sure what I am doing wrong. Thanks.
I found that it will not record if it doesn’t recognize a player in the Kinect. I am trying to capture only the hand movements using the Leap Motion. I don’t need the entire body. I removed most of the joints in the Unity inspector for capturing the motions. I can capture, but when I attempt to apply the result to a different model, Unity complains. The workflow is very difficult for Leap Motion capture.
Hi there, the Kinect Mocap Animator was not designed to record animations without a Kinect sensor 🙂 The Leap Motion finger tracking was only an addition to it. However, to make it record without a detected user, you would need to modify the Update() method of KinectFbxRecorder.cs. Try to comment out the ‘if(userId != 0 && …’ line and make the next line, ‘if(Time.time >= …’, the outer condition. Hope this helps in your case.
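The change described above can be sketched as follows. Note that this is only an illustration of the idea, not the asset’s actual KinectFbxRecorder.cs source; the real condition lines may look somewhat different:

```csharp
// Inside Update() of KinectFbxRecorder.cs - illustrative sketch only.
//
// Before: frames are recorded only while the Kinect has a detected user:
//
//   if (userId != 0 && /* ...other checks... */)
//   {
//       if (Time.time >= nextSaveTime)
//       {
//           // ...record the current frame...
//       }
//   }
//
// After: the user check is commented out, so frames are recorded
// regardless of user detection (e.g. for Leap Motion-only capture):
//
//   // if (userId != 0 && /* ...other checks... */)
//   if (Time.time >= nextSaveTime)
//   {
//       // ...record the current frame...
//   }
```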
Thank you so much for the advice. I will play around with it.
I am able to capture the motion using both Kinect and Leap Motion. When I attempt to apply the mocap to any other model, I am always unsuccessful. I have tried models from Unity, from other sources on the Internet, and just now created a model using MakeHuman. I always get an error along these lines:
Copied Avatar Rig Configuration mis-match. Transform hierarchy does not match: Transform ‘Hips’ for human bone ‘Hips’ not found. I do set Animation Type as Humanoid.
How do I resolve this error or change the workflow such that everything matches? I have followed the instructions on your site to no avail. Thanks!
Just follow the steps here: https://ratemt.com/k2mocap/RetargettheAnimation.html
Both models need to have Humanoid rigs, in order for the Unity animation system to be able to retarget the animation(s) from one model to another.
Man, you are a legend: amazing assets, wonderful support and documentation, gorgeous work. You should be at NASA, you know!
Thank you! My fields of interest are too far from NASA’s 😉
Your website is fantastic. I saw your kinect v2 asset in unity, does it work with steamVR plugin? I’m working on a project which integrates kinect and HTC Vive into unity (kinect captures human motion and maps in vive goggle as avatar motion), will the lighthouse of vive interfere with kinect camera? Thanks!
Hi Ken, to your question: honestly, I don’t know. I think lighthouse doesn’t interfere with Kinect, but I don’t have Vive and have never tried it, to be sure. If anyone reading this comment has experience with the matter, please comment.
Thanks Rumen, I’m testing about that.
It will make me have to pay, but I love your new e-mail support standards.
If you are a legitimate customer of any of my commercial assets, feel free to contact me with short, specific questions, when you get really stuck. My problem with the support is that it literally “eats” my time for R&D. That’s why it should be put into some limits. Making people pay for support definitely lowers the pressure 😉
I have a question about building an interactive floor with your Kinect package and Unity.
Basically I use the SkeletonColliderDemo scene to create a representation of the tracked person and let it interact with 3D objects.
That works fine on a screen, but now I face the challenge of bringing it to an interactive floor with a beamer. Do you have any suggestions on how to achieve the real-world-to-screen ratio (dynamic size, depending on the beamer distance)?
I thought it would make sense to create something like a multiplier, to “tune” the real-world distance to the screen distance depending on the available floor dimensions, but that would not be a perfect solution in my opinion. I could not find a way to translate the Kinect sensor data into smaller/larger units, depending on the floor-projection dimensions.
Have you ever created an interactive floor, and do you have any advice?
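The multiplier idea above can be sketched as a simple pixels-per-meter mapping, assuming you know the real-world size of the projected floor area and the beamer resolution. All names and numbers here are illustrative, not part of any asset:

```csharp
// Illustrative sketch: map a tracked position on the floor plane (meters)
// to beamer pixels. The "multiplier" is simply pixels-per-meter per axis;
// it adapts automatically to the beamer distance, because a farther beamer
// covers a larger floor area for the same pixel resolution.
static class FloorMappingSketch
{
    // floorWidthM / floorLengthM: real-world size of the projected area, in meters.
    // screenWidthPx / screenHeightPx: the beamer resolution, in pixels.
    public static (float x, float y) MetersToPixels(
        float xM, float zM,
        float floorWidthM, float floorLengthM,
        int screenWidthPx, int screenHeightPx)
    {
        float pixelsPerMeterX = screenWidthPx / floorWidthM;
        float pixelsPerMeterY = screenHeightPx / floorLengthM;
        return (xM * pixelsPerMeterX, zM * pixelsPerMeterY);
    }
}

// e.g. a 4 m x 3 m projected area at 1920x1080:
// MetersToPixels(1.0f, 1.5f, 4f, 3f, 1920, 1080) -> (480, 540)
```

The two pixels-per-meter factors would need to be measured (or calibrated) once per installation, since they change whenever the beamer is moved.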
Hi Thomas. What Kinect package exactly do you mean?
To your last question: no, I have not created any interactive floors myself, but I can give you a couple of pieces of advice:
1. In case of an interactive floor, I think your best fit would be the BlobDetectionDemo scene, if the sensor and the beamer are mounted on the ceiling, looking down.
2. If you mean the K2-asset and Kinect-v2 as sensor, please look at the KinectProjectorDemo-scene as well. In this regard, see this tip.
I can’t think of anything else at the moment.