
Oculus Quest 2 and Realsense Software Device #9579

Closed
fwomack9 opened this issue Aug 5, 2021 · 23 comments

@fwomack9

fwomack9 commented Aug 5, 2021



| Required Info | |
|---|---|
| Camera Model | D435i |
| Firmware Version | 5 |
| Operating System & Version | Android |
| Platform | Oculus Quest 2 |
| SDK Version | 2.42 |
| Language | Unity |
| Segment | VR |
| Unity Version | 2020 |

Issue Description

Hello everybody,

I'm designing a VR solution in which the user views a robot arm and a real-time point cloud. I am using an Intel RealSense D435i depth camera and an Oculus Quest 2, and I'm developing in Unity. For communication, I'm using the ROS-Unity TCP connection to send frames, joint states, and controller positions over a shared network.

To work around not having a physically connected RealSense camera, I use the SDK's software device and Syncer classes to bundle and raise framesets. My VideoStreamProfiles are added to SoftwareSensors that belong to the software device instance. I previously built this solution successfully for an HTC Vive and am now porting it to the Quest 2.
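For context, the setup follows roughly this pattern (a minimal sketch with the Intel.RealSense C# wrapper; the buffer sizes, timestamps, and the OnDepthImage callback are illustrative placeholders, not my real code):

```csharp
using Intel.RealSense;

// Minimal sketch: one software depth sensor whose frames are bundled by a Syncer.
var device = new SoftwareDevice();
var depthSensor = device.AddSensor("Depth");
var depthProfile = depthSensor.AddVideoStream(new SoftwareVideoStream
{
    type = Stream.Depth, index = 0, uid = 100,
    width = 640, height = 480, fps = 30,
    bpp = 2,                      // Z16 = 2 bytes per pixel
    format = Format.Z16,
    intrinsics = new Intrinsics() // filled in from the camera's calibration
});

var sync = new Syncer();
depthSensor.Open(depthProfile);
depthSensor.Start(sync.SubmitFrame);  // every submitted frame goes to the syncer

// Called for each depth image received over the network (placeholder signature):
void OnDepthImage(byte[] pixels, double timestampMs, int frameNumber)
{
    depthSensor.AddVideoFrame(pixels, 640 * 2, 2,   // stride, bytes per pixel
        timestampMs, TimestampDomain.SystemTime, frameNumber, depthProfile);
    using (var frameset = sync.WaitForFrames())     // raises a bundled frameset
    {
        // hand the frameset to the point-cloud renderer
    }
}
```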

I've built the RealSense SDK (2.42) into an AAR file that I've added to the plugins in the RealSense SDK Plugins folder. I also have an AndroidPermissions GameObject that instantiates an instance of the Java RsContext class. Everything compiles into an APK and loads onto the Quest 2 just fine, the ROS-Unity TCP connection works, and I am receiving messages. I'm able to get as far as instantiating a Context instance in my software device wrapper class, adding the software device to the context, and instantiating a pipeline from that same context. When I query the context for devices, I find the software device, but unfortunately, when I attempt the pipeline's Start method, it throws an error saying that "no streams are selected".
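Stripped down, the failing sequence looks like this (sketch):

```csharp
var context = new Context();
var softwareDevice = new SoftwareDevice();
// ... sensors and video stream profiles are added to softwareDevice here ...
softwareDevice.AddTo(context);          // register with the shared context

var pipeline = new Pipeline(context);   // pipeline built from the same context

using (var devices = context.QueryDevices())
    Debug.Log("Found " + devices.Count + " device(s)"); // the software device shows up

var profile = pipeline.Start();         // throws "no streams are selected" on the Quest 2
```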

[image: Oculus debugging output]

In the image of my VR debugging output above, you can see that the library is finding "1 Realsense Devices (mask 0xfe)". There's no way to add streams directly to the software device, and I cannot resolve a configuration with enabled streams against the pipeline.

@MartyG-RealSense
Collaborator

Hi @fwomack9 Have you seen the earlier project, created for the original Oculus Quest, that enabled use of the RealSense D435i with Unity?

https://github.com/GeorgeAdamon/quest-realsense

The RealSense user who developed that project also used RsContext, and additionally used C# scripting to make sure that Android camera permissions are explicitly requested from the user if not already granted.

https://github.com/GeorgeAdamon/quest-realsense#step-3-initializing-the-rscontext-java-class-from-unity
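For reference, the permission request itself can be done from a small Unity script along these lines (a minimal sketch using Unity's built-in Android permission API; the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.Android;

public class CameraPermissionRequester : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Explicitly ask for camera access if the user has not granted it yet.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
        }
#endif
    }
}
```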

@fwomack9
Author

fwomack9 commented Aug 5, 2021

Yes, that's what I followed to set permissions up. It works and finds the hardware device when I have it wired into the Quest, but the goal is currently to be completely wireless. The camera is wired to a different machine that publishes frames on the ROS-Unity TCP connection.

@MartyG-RealSense
Collaborator

The recent history of networking a RealSense camera with ROS has been problematic, with very low transmission rates compared to a normal directly-connected camera.

IntelRealSense/realsense-ros#1808
IntelRealSense/realsense-ros#2014

@fwomack9
Author

fwomack9 commented Aug 5, 2021

Yes, we've been working toward solutions for the speeds. However, the issue at the moment is getting the pipeline to recognize the software device. Everything works when it's not on the Quest, even when no frames are being transmitted; of course there's a 'waiting for frames' timeout error, but there's no difficulty recognizing the software device or starting the pipeline. Everything works wirelessly when not on the Quest 2. Thank you!

@fwomack9
Author

fwomack9 commented Aug 5, 2021

Using the ROS-Unity TCP connection when not on the Quest, I'm able to get up to 15 fps on a wired Ethernet connection and 5 fps on a wireless network connection. There are no current issues with getting messages and frames to the Quest 2.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 5, 2021

The Oculus Quest 2 operating system is based on Android 10. If the AAR interface of your project is based around the RealSense Android wrapper (as suggested by the instructions of the original Oculus Quest project) then this could be a potential cause of problems. This is because the RealSense Android wrapper has a known issue with being used with Android 10. A fix that has worked consistently is to set targetSdkVersion (the Android API version to target) to '27'.

[image: targetSdkVersion setting]
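In a Unity project, the equivalent can be set under Player Settings, or from an Editor script like this (sketch; assumes your Unity version exposes API level 27 in the AndroidSdkVersions enum):

```csharp
using UnityEditor;

public static class TargetSdkFix
{
    [MenuItem("Tools/Set Android Target SDK to 27")]
    static void SetTargetSdk27()
    {
        // Work around the RealSense Android wrapper issue on Android 10
        // by targeting Android API level 27.
        PlayerSettings.Android.targetSdkVersion = AndroidSdkVersions.AndroidApiLevel27;
    }
}
```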

@fwomack9
Author

fwomack9 commented Aug 9, 2021

I rebuilt everything with the targetSdkVersion set to '27' and I'm still facing the same error as before. To recap: instead of a normal hardware device connected through the USB-C port on the Oculus, the app uses the Intel RealSense software device class to bundle frames received on a ROS image topic, originally published by a RealSense camera on a different machine. I get the camera intrinsics before the error shown above. Thank you!

@MartyG-RealSense
Collaborator

One reason why streams might be inaccessible is that another process could be accessing them first. The SDK has a set of rules called the Multi-Streaming Model. It dictates that if a particular stream on a particular camera is enabled ('claimed') by a process, then no other process can access that stream until the currently active stream is disabled and the claim on it released.

https://github.com/IntelRealSense/librealsense/blob/master/doc/rs400_support.md#multi-streaming-model

Are there any processes in your system that could be accessing the streams before software_device is able to start them with its pipeline?
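In C# wrapper terms, a stream is claimed between Open/Start and Stop/Close, so the question is whether anything else touches the profile inside that window (sketch):

```csharp
// A stream profile is 'claimed' from Open() until Close().
sensor.Open(profile);                 // claim the stream
sensor.Start(frame => { /* ... */ }); // frames delivered on this callback
// ... no other process or pipeline can open this stream here ...
sensor.Stop();                        // stop frame delivery
sensor.Close();                       // release the claim
```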

@fwomack9
Author

fwomack9 commented Aug 9, 2021

In Unity there is no Open method for the software device; is there a way to check whether the streams are being accessed by another process? No other place in the code refers to or accesses frames from the ActiveProfile until after it's started, which it currently can't do. In the meantime, I get the same error in Unity that my Oculus Quest raises when I change the RealSense package in Unity from 2.42 to 2.47, and it works again when I switch back. Could I be missing a change in the correct way to implement a software device?

[image: Unity error output]

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 9, 2021

There is a case in the link below about a RealSense user whose Unity project uses software_device with a point cloud. It may provide useful insights for you.

#9588

You may also be interested in the subject of integrating RealSense rs-net networking into a Unity project.

#8347

@fwomack9
Author

fwomack9 commented Aug 9, 2021

My point cloud shades correctly with RealSense 2.42 in Unity and on the HTC Vive using SteamVR. I have the same settings from debugging this same issue previously, in a similar post you also helped with haha.

[image: Unity point cloud]

The networking isn't causing any current issues; the problem is just starting the Pipeline with a SoftwareDevice instance as the only "connected" device when using the 2.47 Unity package. I've added the software device to the context and made a dummy configuration with enabled streams.

@MartyG-RealSense
Collaborator

Have you tested the project with the unitypackage from SDK 2.42.0 to see whether it is an issue introduced since 2.42.0 that is causing the problem?

@fwomack9
Author

fwomack9 commented Aug 9, 2021

Thanks @MartyG-RealSense for your continued help. With 2.42 the project works everywhere except in the Oculus headset (the pipeline starts, receives frames, and renders point clouds in the Unity Player). I don't think it's an issue that was introduced later, because in #9588 they get the software device to start up with 2.47 successfully. My implementation just doesn't yet, so I'm wondering if it has to do with the order in which I initialize and open everything. I will attach some of my code.

```csharp
context = new Context();
var cfg = new Config();
sync = new Syncer();

// Get the intrinsics (from the ROS topic subscribers) before they are used below.
depthIntrinsics = depthIntrinsicsSubscriber.getIntrinsics();
colorIntrinsics = colorIntrinsicsSubscriber.getIntrinsics();

cfg.EnableStream(Stream.Depth, depthIntrinsics.width, depthIntrinsics.height, Format.Z16, 30);
cfg.EnableStream(Stream.Color, colorIntrinsics.width, colorIntrinsics.height, Format.Bgra8, 30);

processMode = ProcessMode.Multithread;
software_device = new SoftwareDevice();
software_device.AddTo(context);

m_pipeline = new Pipeline(context);
colorizer = new Colorizer();

// Depth sensor and its stream profile.
depth_sensor = software_device.AddSensor("Depth");
depthProfile = depth_sensor.AddVideoStream(new SoftwareVideoStream
{
    type = Stream.Depth,
    index = 0,
    uid = 100,
    width = depthIntrinsics.width,
    height = depthIntrinsics.height,
    fps = 5,
    bpp = 2, // mono16
    format = Format.Z16,
    intrinsics = depthIntrinsics // might have to dress this Intrinsics up as a diff type
});
Debug.Log(depth_sensor + " " + depth_sensor.StreamProfiles);

// Color sensor and its stream profile.
color_sensor = software_device.AddSensor("Color");
colorProfile = color_sensor.AddVideoStream(new SoftwareVideoStream
{
    type = Stream.Color,
    index = 0,
    uid = 101,
    width = colorIntrinsics.width,
    height = colorIntrinsics.height,
    fps = cameraType == rs ? 30 : 5,
    bpp = cameraType == rs ? 3 : 4, // rgb8 = 3 bytes/pixel, bgra8 = 4 bytes/pixel
    format = cameraType == rs ? Format.Rgb8 : Format.Bgra8,
    intrinsics = colorIntrinsics
});
Debug.Log(color_sensor + " " + color_sensor.StreamProfiles);

frame_number = 0;
software_device.SetMatcher(Matchers.Default);
Sensors = new SoftwareSensor[2] { depth_sensor, color_sensor };
Streams = new VideoStreamProfile[2] { depthProfile, colorProfile };

// Depth scale in meters per unit.
depth_sensor.AddReadOnlyOption(Option.DepthUnits, 0.001f); // replace if there's a diff standard

depth_sensor.Open(depthProfile);
color_sensor.Open(colorProfile);
depth_sensor.Start(sync.SubmitFrame);
color_sensor.Start(sync.SubmitFrame);
ActiveProfile = m_pipeline.Start(); // !!!!!!!!!! errors out here with previously attached error images

if (processMode == ProcessMode.Multithread)
{
    stopEvent.Reset();
    worker = new Thread(WaitForFrames);
    worker.IsBackground = true;
    worker.Start();
}
StartCoroutine(WaitAndStart());
```

The above code works with 2.42 and not with 2.47. Elsewhere, it subscribes to ROS topics for the intrinsics before this is run. Neither version works on the Oculus.

@MartyG-RealSense
Collaborator

Usually, if you have defined a custom stream configuration with cfg instructions, you have to tell the pipeline's start instruction about it, or the stream definitions will be ignored.

```csharp
ActiveProfile = m_pipeline.Start(cfg);
```
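If you want to check whether the configuration can be satisfied before starting, something along these lines may help (sketch; assumes the C# wrapper exposes the native can_resolve/resolve calls as CanResolve/Resolve):

```csharp
if (cfg.CanResolve(m_pipeline))
    ActiveProfile = m_pipeline.Start(cfg);
else
    Debug.LogError("cfg cannot be resolved against the available devices/streams");
```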

@fwomack9
Author

fwomack9 commented Aug 9, 2021

If I start the pipeline with the cfg, it has never been able to find the software device, even in 2.42.0.

[image: Unity error]

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Aug 9, 2021

The two subjects of C# coding in general and software_device programming specifically are somewhat outside of my experience. A useful reference may be a commit for a C# example for using software device with alignment. It provides a demonstration of using cfg.

299112e

299112e#diff-1dede5898dd4841a427c6673f12fa807b0c9d56e07c3388948b57a700c8726e3R72

@fwomack9
Author

fwomack9 commented Aug 9, 2021

From my limited understanding, in that example they're able to start the pipeline with the configuration because they're populating their software device with frames from a RealSense camera that is directly connected to the computer. I will try rebuilding the AAR file with 2.42.0 just in case, and will update if I'm able to get the software device to work.

@MartyG-RealSense
Collaborator

Okay, thanks very much. Good luck!

@fwomack9
Author

Update: building everything with 2.42.0 and a targetSdkVersion set to '27', with network permissions, still results in that "no streams found!" error, and only on the Oculus, despite the software device being found. When I query for the device and its streams, I'm able to verify that they exist.

It confuses me why this works in the Unity window with 2.42 and not with 2.47. I've read through the following posts about rs_net devices, and although that implementation is similar to the software device, I'm unable to enable the "device" from a configuration because there are no serial IDs associated with software devices.

#8423
#6376
#9376

I'll continue working on figuring out what makes the software streams suddenly undetectable when building to an APK for the Oculus. Thank you for any pointers anyone can provide.
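For completeness, the query I'm using to verify the device and its streams looks roughly like this (sketch; uses the same Context instance the software device was added to, and assumes the usual Device.Sensors accessor; some info fields may be absent on a software device):

```csharp
using (var devices = context.QueryDevices())
{
    foreach (var dev in devices)
    {
        Debug.Log("Device: " + dev.Info[CameraInfo.Name]);
        foreach (var sensor in dev.Sensors)
            foreach (var sp in sensor.StreamProfiles)
                Debug.Log("  stream " + sp.Stream + ", uid " + sp.UniqueID);
    }
}
```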

@MartyG-RealSense
Collaborator

Thanks very much, @fwomack9 - good luck!

@MartyG-RealSense
Collaborator

BTW, as you are making use of sensor access, I thought you might be interested to know that a new sensor API for Android called getProfile has been included in the new SDK 2.49.0 released today, along with an example Java program for it in the link below.

https://github.com/IntelRealSense/librealsense/tree/master/wrappers/android/examples/sensor

@MartyG-RealSense
Collaborator

Hi @fwomack9 Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense
Collaborator

Case closed due to no further comments received.
