About

Hi, my name is Rumen Filkov, or just Rumen. I’m an electrical engineer with an MSc in computer science, a stubborn entrepreneur and a life-long learner. I’m the owner and principal consultant at RF Solutions. I work on my own R&D projects and consult for companies that use them or utilize similar technologies.

The main part of my work and research is focused on the following areas:

    • Computer vision – utilization of AI models for depth estimation, body and hand tracking, object detection, semantic segmentation and the like, providing spatial analysis and understanding.
    • Natural user interface (NUI) devices – integration of and interaction with depth sensors like Femto Bolt & Mega, Azure Kinect, Kinect-v2 & v1 or RealSense D400 series.
    • Virtual and augmented reality (VR/AR) wireless devices – HoloLens, Oculus Quest or Apple Vision Pro, powered by the Unity 3D game engine.
    • User experience – analysis and improvement.

I know all this is a bit too much for a single person, but I’m a stubborn Capricorn and don’t give up easily.

Contact information:

Regarding support requests (as of 5 February 2024):

  • Please don’t contact me on weekends (Friday to Sunday) or during holidays, and please contact me only if you get really stuck. I’m quite busy at work and need a break on the weekends.
  • The e-mail contact above is reserved primarily for customers of my commercial Unity assets. Students and academic staff can also use it to request free academic copies of the assets.
  • Customers of my commercial Unity assets are eligible for up to 3 free e-mail support requests. Please mention your invoice number (or order number) in the e-mail.
  • If you need additional short-term (days to weeks), middle-term (months) or long-term (more than a year) support, please contact me for more information.
  • Please note that users of my free assets, and those who requested academic or personal copies of the commercial assets, are not eligible for e-mail support. Please research and experiment a bit more on your own.

37 thoughts on “About”

  1. Hey Rumen,
    Many thanks for the Kinect with MS-SDK package!
    That being said, is there any chance that the KinectExtras with MsSDK package will come back to the asset store? If not, is there any other way to get it?

    Thanks in advance!

    • I’ll think about this. It was fully integrated into the K2-asset, which works with Kinect v1, too. That’s why I deprecated it, in order not to support two equivalent packages. If you need it for K1 sensors only, you may send me an e-mail request to get it from me.

      • Thanks for the swift response. In regards to both the KinectExtras and K2-asset, does any of them have basic functionality for audio capture and/or audio detection?

        Thank you in advance for taking the time to answer this.

      • Both assets provide grammar-based speech recognition. Otherwise, the audio API is available in the K2-asset, as provided by the MS plugin.

  2. Hello. Thank you very much for the Kinect with MS-SDK asset. Have you thought about a Run gesture? Is it possible to add one? I can make some payment for this.

  3. I got the following error, please help:

    error CS1061: Type `System.IO.FileInfo' does not contain a definition for `Length' and no extension method `Length' of type `System.IO.FileInfo' could be found (are you missing a using directive or an assembly reference?)

  4. Pingback: Kinect v2 Examples with MS-SDK | RFilkov.com - Technology, Health and More

  5. Hello. Thank you very much for the Kinect V2 with MS-SDK asset.

    I have a problem with the asset. The Kinect v2 automatically opens and closes, on and off, when I use the KinectController object with KinectManager, BackgroundRemovalScript and InteractionManager all attached together.

    I changed to other Kinect v2 sensors and other computers, but the problem still happens.

    Please tell me how to fix this. I want the Kinect v2 sensor to open once, when the Unity executable launches.

    • Hi, the sensor is controlled by the KinectManager. You need to have only one KinectManager in your game. Please look at the multi-scene demo and the respective PDF in the _Readme-folder, to find out how to set up the KinectManager in a multi-scene environment.

      • Thank you for giving a comment, Mr. Rumen F.

        The app I develop is just one scene. It’s not a multi-scene environment.

        I tested your asset and found out that if there is a person or people in front of it, the Kinect v2 sensor doesn’t close automatically. The Kinect v2 sensor opens once.

        The Kinect v2 sensor closes automatically when there is no person. Is this part of the specs of the asset?

      • Well, I’m a bit surprised. This asset has thousands of users, and nobody has complained so far that the sensor closes automatically when there are no people in front of it. Are you testing the unaltered demo scenes, or your own scene? Also, the full KinectManager source is there, so you can put some breakpoints in Awake() and OnDestroy() to check when (and why) they get called.

  6. Hi,
    I purchased Kinect v2 Examples in the Asset Store.

    I am sorry, I sent the wrong question before.

    The exact question is:
    what do I modify in the “KinectHolographicViewer” scene, if I want the scene to move toward my position when I move in the Kinect’s direction?

    It works well horizontally and vertically, but the scene moves in the opposite direction at the default depth.

    To confirm:
    by default, the scene moves farther away when I move toward the Kinect.
    I want the scene to come nearer when I move toward the Kinect.

    thanks a lot~

  7. Hi, I purchased the Kinect MoCap Animator and am using it with only the Leap Motion. When I record, everything seems to work well, the screen updates saying it is recording and also stopped. But when I attempt to locate the mocap file, it does not exist. I am not sure what I am doing wrong. Thanks.

    • I found that it will not record if it doesn’t recognize a player via the Kinect. I am trying to capture only the hand movements using the Leap Motion; I don’t need the entire body. I removed most of the joints in the Unity inspector for capturing the motions. I can capture, but when I attempt to apply the result to a different model, Unity complains. The workflow is very difficult for Leap Motion capture.

      • Hi there, the Kinect Mocap Animator was not designed to record animations without a Kinect sensor 🙂 The LeapMotion finger tracking was only an addition to it. However, to make it record without a detected user, you would need to modify the Update()-method of KinectFbxRecorder.cs. Try to comment out ‘if(userId != 0 &&…’ and make the next line ‘if(Time.time >= …’. Hope this helps in your case.
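
        For illustration, here is a minimal sketch of the kind of change described above. All names in it (userId, nextSaveTime, saveInterval, SaveMocapFrame) are assumptions made for the sketch; the actual code in KinectFbxRecorder.cs may differ.

        ```csharp
        // Hypothetical sketch of the recording gate in a Unity recorder script.
        // The field and method names here are illustrative, not the asset's real ones.
        void Update()
        {
            // Original behavior: save a frame only while a user is tracked.
            //if (userId != 0 && Time.time >= nextSaveTime)

            // Modified behavior: save on the timer alone, without a tracked user.
            if (Time.time >= nextSaveTime)
            {
                nextSaveTime = Time.time + saveInterval;
                SaveMocapFrame();  // hypothetical method that writes the next mocap frame
            }
        }
        ```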

  8. I am able to capture the motion using both Kinect and Leap Motion. When I attempt to apply the mocap to any other model, I am always unsuccessful. I have tried models from Unity, from other sources on the Internet, and just now created a model using MakeHuman. I always get an error along these lines:

    Copied Avatar Rig Configuration mis-match. Transform hierarchy does not match: Transform ‘Hips’ for human bone ‘Hips’ not found. I do set Animation Type as Humanoid.

    How do I resolve this error or change the workflow such that everything matches? I have followed the instructions on your site to no avail. Thanks!

  9. Man, you are a legend: amazing assets, wonderful support and documentation, gorgeous work. You should be at NASA, you know!

  10. Hi Rumen,

    Your website is fantastic. I saw your Kinect v2 asset in Unity. Does it work with the SteamVR plugin? I’m working on a project which integrates Kinect and HTC Vive into Unity (Kinect captures human motion and maps it to the avatar’s motion in the Vive goggles). Will the Lighthouse of the Vive interfere with the Kinect camera? Thanks!

    • Hi Ken, to your question: honestly, I don’t know. I think the Lighthouse doesn’t interfere with the Kinect, but I don’t have a Vive and have never tried it, to be sure. If anyone reading this comment has experience with the matter, please comment.

    • If you are a legitimate customer of any of my commercial assets, feel free to contact me with short, specific questions when you get really stuck. My problem with support is that it literally “eats” my time for R&D. That’s why it should be put within some limits. Making people pay for support definitely lowers the pressure 😉

  11. Hello Rumen,

    I have a question about establishing an interactive floor with your Kinect package and Unity.

    Basically, I use the SkeletonColliderDemo scene to create an adaptation of the tracked person and let it interact with 3D objects.
    That works fine on a screen, but now I have the challenge of bringing it to an interactive floor with a beamer (projector). Do you have some suggestions on how to achieve the “real-world data to screen ratio” relation (with dynamic size, depending on the beamer distance)?

    I thought it would make sense to create something like a multiplier, to “tune” the real-world distance to the screen distance depending on the available floor dimensions, but that would not be a perfect solution in my opinion. I could not find a way to translate the Kinect sensor data into smaller/larger units, depending on the floor projection dimensions.

    Have you ever created an interactive floor, and do you have any advice?

    Thx!

    Best regards,

    Thomas

    • Hi Thomas. What Kinect package exactly do you mean?
      To your last question: No, I have not created any interactive floors myself, but I can give you a couple of tips:
      1. In case of an interactive floor, I think your best fit would be the BlobDetectionDemo-scene, if the sensor and the beamer would be mounted on the ceiling, looking down.
      2. If you mean the K2-asset and Kinect-v2 as sensor, please look at the KinectProjectorDemo-scene as well. In this regard, see this tip.
      I can’t think of anything else at the moment.
