
Intel RealSense for Unity - how to control a game with the given wrapper


13 comments

  • MartyG

    Hi Saadisaadi1. The old 2016 version of the RealSense compatibility plugin for Unity at the link below was best suited to integrating RealSense into games using face, hand and gesture control. It works with the now-retired RealSense SR300 camera model but, unfortunately, is not compatible with modern RealSense cameras such as the 400 Series, and it was not designed for recent Unity versions.

    https://github.com/IntelRealSense/librealsense/issues/10114#issuecomment-1002456887

     

    The current version of the Unity compatibility wrapper of the RealSense SDK does not have similar control-input ability.

     

    In regard to external free options, a product called Nuitrack 3D that is compatible with modern RealSense cameras can track movement of body parts and has Unity integration. Its free trial never expires, though each session is limited to 3 minutes.

     

    https://nuitrack.com/

    https://github.com/3DiVi/nuitrack-sdk/blob/master/doc/Unity_Basic.md

  • Saadisaadi1

    Thank you MartyG, but what can I do with the Unity wrapper right now? For example, how can I use it to control a game, and what built-in functionality does it have at the moment?

  • MartyG

    The RealSense Unity wrapper in its current state can only display camera information and does not have mechanisms for interactions such as moving an object.

  • Saadisaadi1

    OK MartyG, that would prevent me from integrating it with my already-made games.

    So I want to ask about using the SDK with OpenCV - what can I do with that? I heard that I can use OpenCV with MediaPipe to track hands and recognize the current gestures. Can I also use the RealSense to get the distance of the hand from the camera? How can I use it with OpenCV, and what functionality can I get, so that I can use this to control a game?

    This is important to know because it will dictate whether I'm going to learn the APIs of OpenCV, MediaPipe and your SDK for Python.

  • MartyG

    The Unity Asset Store has an OpenCV interface, though it is not inexpensive.

    https://assetstore.unity.com/packages/tools/integration/opencv-for-unity-21088

     

    Alternatively, as Unity uses C# code (and therefore so does the RealSense Unity wrapper) it is possible to interface with the C# wrapper of the RealSense SDK to access its functions from Unity.

    https://github.com/IntelRealSense/librealsense/issues/1570

  • Saadisaadi1

    MartyG, in my last comment I was not talking about Unity - I was talking about OpenCV and MediaPipe in Python; I would make the games using Pygame instead. Can you please tell me what functionality I can get? For example, can I get each pixel's distance from the camera? This line:

    imageRGB = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

    gives an image of the RGB colors. Does your SDK have functionality that returns something similar, but as a depth image? Then, after processing the image above with MediaPipe to get the x and y of a finger, I could read the z from your depth image at (x, y). That way I would know where the finger is in 3D, and I could do some controlling based on the depth of the hand and its distance from the camera.

  • MartyG

    The link below has a script that uses the SDK's Python wrapper (pyrealsense2) together with MediaPipe to get the distance at a given X and Y.

    https://github.com/IntelRealSense/librealsense/issues/9088
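    As a rough sketch of what such a script does, the lookup can be illustrated without a camera by simulating the depth frame as raw 16-bit depth units (with real hardware, pyrealsense2's `depth_frame.get_distance(x, y)` returns metres directly; the frame values and landmark position below are made up):

```python
# Sketch: turning a MediaPipe (x, y) landmark into a 3D point using a
# RealSense depth image. The frame below is simulated raw 16-bit data;
# with a real camera you would call depth_frame.get_distance(x, y) in
# pyrealsense2 instead, which returns metres directly.

DEPTH_SCALE = 0.001  # metres per raw depth unit (typical D400 value)

def depth_at(depth_image, x, y, scale=DEPTH_SCALE):
    """Distance in metres at pixel (x, y) of a raw depth image."""
    return depth_image[y][x] * scale  # rows are y, columns are x

# Simulated 4x4 depth frame; imagine MediaPipe reported a fingertip at (2, 1)
frame = [
    [0,   0,   0,   0],
    [0, 500, 650,   0],
    [0, 510, 655,   0],
    [0,   0,   0,   0],
]

z = depth_at(frame, x=2, y=1)
print(z)  # 0.65 -> the fingertip is about 65 cm from the camera
```

    Note that MediaPipe's landmark coordinates are normalized to 0-1, so they must be scaled by the image width and height first, and the colour and depth streams should be aligned (the SDK's align feature) so the two pixel grids match.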

  • Saadisaadi1

    Thank you MartyG. I will receive the camera in a couple of weeks; I will test the code and come back to you if there is something else to ask. For now, thank you very much.

  • MartyG

    You are very welcome.  Good luck!

  • Saadisaadi1

    Hi MartyG, I have now got a RealSense camera. I use the camera in Python: I use MediaPipe to detect where the hand is, then take the depth and send it to the game in Unity using sockets. But I also want to use the same camera in Unity, to show the player in the game using the first Unity example that you have (sending the whole frame from Python is a problem because it leads to low FPS). The problem is that I can't use the camera in two different apps (Python + Unity) at the same time. Is there a solution for that?
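    The coordinates-over-sockets hand-off described here can be sketched end to end: send only the hand's (x, y, z) over TCP instead of whole frames. The port number and the comma-separated message format are made up for illustration; on the Unity side a C# TcpListener would take the place of the stand-in Python listener below.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5065  # port chosen arbitrarily for this sketch

received = []
ready = threading.Event()

def fake_unity_listener():
    """Stand-in for Unity's C# TcpListener: accept one message and store it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # tell the sender the listener is up
        conn, _ = srv.accept()
        with conn:
            received.append(conn.recv(1024).decode())

listener = threading.Thread(target=fake_unity_listener)
listener.start()
ready.wait()

# In the real script, x and y come from MediaPipe's hand landmarks and z
# from the RealSense depth frame; they are hard-coded here.
x, y, z = 0.42, 0.58, 0.65
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(f"{x},{y},{z}".encode())

listener.join()
print(received[0])  # 0.42,0.58,0.65
```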

  • MartyG

    Hi Saadisaadi1. The RealSense SDK operates on rules called the Multi-Streaming Model, which dictate that two applications cannot simultaneously access the same stream on the same camera. So if Unity starts the depth stream first, the Python application, launched second, cannot also use the depth stream.

     

    The reverse is true too.  If the Python application is launched first and uses the depth stream then Unity cannot use the depth stream. 

    https://github.com/IntelRealSense/librealsense/blob/master/doc/rs400_support.md#multi-streaming-model

     

    You can, though, access a different stream on the same camera in two different applications. So, for example, you could display the RGB image in Unity if the Python application was not using RGB.
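    The rule amounts to a disjointness check on the streams each application requests (the stream names here are just labels for illustration, not SDK identifiers):

```python
# Multi-Streaming Model in miniature: two applications may share a camera
# only if they do not request the SAME stream.

def streams_compatible(app_a, app_b):
    """True if the two applications request disjoint sets of streams."""
    return set(app_a).isdisjoint(app_b)

# Python script takes depth, Unity takes colour -> allowed
print(streams_compatible({"depth"}, {"color"}))           # True
# Both want depth -> the second application's start will fail
print(streams_compatible({"depth"}, {"depth", "color"}))  # False
```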

  • Saadisaadi1

    Hi MartyG, I wanted to try to make a game with only Unity, so I tried to analyze the texture of the RawImage that your example scene (texturedepthandtexturecolor) puts the frame data on. The problem is that when I go over the texture, I always find that it is white. It seems from your depth shader that the depth data is put into the red channel, but every pixel of the texture has a red-channel value of 1. Do you have any example code that analyzes the data of the texture, for example to get the max or min depth? I need an example in regular C# Unity code, not a shader, because I want to use the max or min for controlling the game, not just for rendering the image with a colormap (as you do with the shader).

  • MartyG

    The best way to achieve your goal may be to apply a post-processing filter in the RealSense Unity wrapper, such as the Threshold filter, which allows you to define a minimum and maximum distance for rendering depth. The link below provides advice about implementing post-processing filtering in the Unity wrapper.

    https://github.com/IntelRealSense/librealsense/issues/1898
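    As a language-neutral illustration of both halves of the question - the min/max query and the Threshold filter's clamping - here is the logic in plain Python on a raw depth image, using the RealSense convention that a depth value of 0 means "no data" (frame values below are made up):

```python
# Sketch of the two operations discussed: finding the min/max depth in a
# frame, and a Threshold-style filter that zeroes out pixels outside a
# [lo, hi] distance band. Depth value 0 means "no data"; scale converts
# raw 16-bit units to metres.

def min_max_depth(depth_image, scale=0.001):
    """(min, max) distance in metres, ignoring zero (invalid) pixels."""
    values = [v for row in depth_image for v in row if v > 0]
    return min(values) * scale, max(values) * scale

def threshold(depth_image, lo_m, hi_m, scale=0.001):
    """Zero out pixels outside [lo_m, hi_m] metres, like the SDK's Threshold filter."""
    return [[v if lo_m <= v * scale <= hi_m else 0 for v in row]
            for row in depth_image]

frame = [[0, 400, 800],
         [1200, 600, 0]]

print(min_max_depth(frame))        # (0.4, 1.2)
print(threshold(frame, 0.5, 1.0))  # [[0, 0, 800], [0, 600, 0]]
```

    In Unity the analogous C# step would be reading the pixel data back from the texture (e.g. Texture2D.GetPixels), which requires a CPU-readable texture; a texture whose red channel always reads as 1 may be the GPU-side colourized render rather than the raw depth data.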

