Oculus Quest 2 and Realsense Software Device #9579
Comments
Hi @fwomack9. Have you seen the earlier project created for the original Oculus Quest that enabled use of the RealSense D435i with Unity? https://github.com/GeorgeAdamon/quest-realsense The RealSense user who developed that project also used RsContext, and additionally used C# scripting to make sure that Android camera permissions are explicitly requested from the user if they have not already been granted.
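For reference, the permission scripting referred to might look roughly like this — a minimal sketch assuming Unity 2018.3+, where the UnityEngine.Android.Permission API is available (the class name here is illustrative, not from that project):

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class CameraPermissionRequester : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Explicitly request camera access if the user has not already granted it.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
        }
#endif
    }
}
```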
Yes, that's what I followed to set permissions up. It works and finds the hardware device when I have it wired into the Quest, but the goal is currently to be completely wireless. The camera is wired to a different machine that publishes frames over the ROS-Unity TCP connection.
The recent history of networking a RealSense camera with ROS has been problematic, with very low transmission rates compared to a normal directly-connected camera. IntelRealSense/realsense-ros#1808
Yes, we've been working towards solutions for the transmission speeds. However, the issue at the moment is getting the pipeline to recognize the software device. Everything works when it's not on the Quest, even when no frames are being transmitted. Of course there's a "waiting for frames" timeout error, but there's no difficulty recognizing the software device or starting the pipeline. Everything works wirelessly when not on the Quest 2. Thank you!
Using the ROS-Unity TCP connection off the Quest, I'm able to get up to 15 fps on a wired Ethernet connection and 5 fps on a wireless network connection. There are no current issues with getting messages and frames to the Quest 2.
The Oculus Quest 2 operating system is based on Android 10. If the AAR interface of your project is based around the RealSense Android wrapper (as suggested by the instructions of the original Oculus Quest project) then this could be a potential cause of problems, because the RealSense Android wrapper has a known issue when used with Android 10. A fix that has worked consistently is to set targetSdkVersion (the Android API version to target) to '27'.
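In a Unity project this is normally done under Player Settings > Other Settings > Target API Level; the same change can be scripted from the Editor, as in this sketch (the menu path and class name are illustrative):

```csharp
using UnityEditor;

public static class TargetSdkSetter
{
    [MenuItem("Tools/Set Android Target API to 27")]
    public static void SetTargetSdk27()
    {
        // Cast the raw API level rather than relying on a specific enum
        // member name, which can vary between Unity versions.
        PlayerSettings.Android.targetSdkVersion = (AndroidSdkVersions)27;
    }
}
```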
I rebuilt everything with targetSdkVersion set to '27' and I'm still facing the same error as before. To recap: instead of a normal hardware device connected through the USB-C port on the Oculus, the app uses Intel RealSense's software device class to bundle frames received on a ROS image topic originally published by a RealSense camera on a different machine. I get the camera intrinsics before the error shown above. Thank you!
One reason why streams might be inaccessible is that another process could be accessing them first. The SDK has a set of rules called the Multi-Streaming Model, which dictate that if a particular stream on a particular camera is enabled ('claimed') by a process, another process cannot access that stream until the currently active stream is disabled and the claim on it is released. Are there any processes in your system that could be accessing the streams before software_device is able to start them with its pipeline?
In Unity there is no Open method for the software device; is there a way to check whether the streams are being accessed by another process? No other place in the code refers to or accesses frames from the ActiveProfile until after it has started, which it currently can't do. In the meantime, I get the same error in Unity that my Oculus Quest raises when I change the RealSense package in Unity from 2.42 to 2.47, and it works again when I switch back. Could I be missing a change in the correct way to implement the use of a software device?
Have you tested the project with the unitypackage from SDK 2.42.0 to see whether the problem is an issue introduced since 2.42.0?
Thanks @MartyG-RealSense for your continued help. With 2.42 the project works right up until it's running in the Oculus headset (the pipeline starts, receives frames, and renders point clouds in the Unity Player). I don't think it's an issue that was introduced since then, because in #9588 they get the software device to start up successfully with 2.47. My implementation just doesn't yet, so I'm wondering if it has to do with the order in which I initialize and open everything. I will attach some of my code.
The above code works with 2.42 and not with 2.47. Elsewhere, the code subscribes to ROS topics for the intrinsics before this is run. Neither works on the Oculus.
Usually, if you have defined a custom stream configuration with cfg instructions then you have to tell the pipe start instruction about them or the stream definitions will be ignored: ActiveProfile = m_pipeline.Start(cfg);
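In the C# wrapper that pattern looks roughly like the following sketch; the 640x480 depth parameters are assumptions for illustration, not taken from the thread:

```csharp
using Intel.RealSense;

public static class PipelineStartExample
{
    public static PipelineProfile StartWithConfig(Context ctx)
    {
        var cfg = new Config();
        // Tell the config which streams to enable (placeholder parameters).
        cfg.EnableStream(Stream.Depth, 640, 480, Format.Z16, 30);

        var pipeline = new Pipeline(ctx);
        // Pass cfg to Start so the custom stream definitions are used.
        return pipeline.Start(cfg);
    }
}
```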
The two subjects of C# coding in general and software_device programming specifically are somewhat outside my experience. A useful reference may be a commit for a C# example of using software device with alignment; it provides a demonstration of using cfg. 299112e#diff-1dede5898dd4841a427c6673f12fa807b0c9d56e07c3388948b57a700c8726e3R72
From my limited understanding, in that example they're able to start the pipeline with the configuration because they're populating their software device with frames from a RealSense camera that is directly connected to the computer. I will try rebuilding the AAR file with 2.42.0 just in case, and will update if I'm able to get the software device working then.
Okay, thanks very much. Good luck!
Update: building everything with 2.42.0 and targetSdkVersion set to '27', with network permissions, still results in that "no streams found!" error, only on the Oculus, despite the software device being found. When I query for the device and its streams, I'm able to verify that they exist. It confuses me why this works in the Unity window with 2.42 and not with 2.47. I've read through posts about rs_net devices, and although that implementation is similar to the software device, I'm unable to enable the "device" from a configuration because there are no serial IDs associated with software devices. I'll continue working on figuring out what makes the software streams suddenly undetectable when building to an APK for the Oculus. Thank you for any pointers anyone can provide.
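For reference, a device and stream query of the sort described might look like this sketch in the C# wrapper (the class name and logging are illustrative, not from the original code):

```csharp
using UnityEngine;
using Intel.RealSense;

public static class DeviceQuery
{
    // Log every device the context can see, along with each sensor's
    // stream profiles, to confirm the software streams are registered.
    public static void Dump(Context ctx)
    {
        foreach (Device dev in ctx.QueryDevices())
        {
            foreach (Sensor sensor in dev.Sensors)
            {
                foreach (StreamProfile profile in sensor.StreamProfiles)
                {
                    Debug.Log($"{profile.Stream} #{profile.Index} " +
                              $"{profile.Format} @ {profile.Framerate} fps");
                }
            }
        }
    }
}
```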
Thanks very much, @fwomack9 - good luck!
BTW, as you are making use of sensor access, I thought you might be interested that a new sensor API for Android called getProfile has been included in the new SDK 2.49.0 released today, along with an example Java program for it in the link below. https://github.com/IntelRealSense/librealsense/tree/master/wrappers/android/examples/sensor
Hi @fwomack9. Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received.
Issue Description
Hello everybody,
I'm designing a VR solution in which the user views a robot arm and a real-time point cloud. I am using an Intel RealSense D435i depth camera and an Oculus Quest 2, and I'm developing in Unity. For communication, I'm using the ROS-Unity TCP connection to send frames, joint states, and controller positions over a shared network.
To get around not having a physically connected RealSense camera, I use the SDK's software device and Syncer classes to bundle and raise framesets. My VideoStreamProfiles are added to SoftwareSensors that belong to the software device instance. I previously built this solution successfully for an HTC Vive and am now porting it to the Quest 2.
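A minimal sketch of that kind of setup, assuming the standard Intel.RealSense C# wrapper API (the resolution, uid value, and the OnDepthImage helper are placeholders, not taken from the original code):

```csharp
using Intel.RealSense;

public class SoftwareDeviceWrapper
{
    readonly SoftwareDevice device = new SoftwareDevice();
    SoftwareSensor depthSensor;
    VideoStreamProfile depthProfile;
    int frameNumber;

    public void Setup(Context ctx, Intrinsics intrinsics)
    {
        depthSensor = device.AddSensor("Depth");
        depthProfile = depthSensor.AddVideoStream(new SoftwareVideoStream
        {
            type = Stream.Depth,
            index = 0,
            uid = 100,               // placeholder unique id
            width = 640,             // placeholder resolution
            height = 480,
            fps = 30,
            bpp = 2,                 // Z16 = 2 bytes per pixel
            format = Format.Z16,
            intrinsics = intrinsics  // e.g. received over ROS
        });

        // Register the software device with the same context the pipeline uses.
        device.AddTo(ctx);
    }

    // Inject a depth image received over the network as a frame.
    public void OnDepthImage(byte[] pixels, double timestampMs)
    {
        depthSensor.AddVideoFrame(pixels, 640 * 2, 2, timestampMs,
            TimestampDomain.SystemTime, ++frameNumber, depthProfile);
    }
}
```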
I've built the RealSense SDK (2.42) into an AAR file that I've added to the plugins in the RealSense SDK Plugins folder. I also have an AndroidPermissions GameObject that instantiates an instance of the Java RsContext class. Everything compiles into an APK and loads onto the Quest 2 just fine. The ROS-Unity TCP connection works and I am receiving messages. I'm able to get as far as instantiating a context in my software device wrapper class, adding the software device to the context, and instantiating a pipeline from that same context. When I query the context for devices, I find the software device, but unfortunately, when I attempt the pipeline's Start method, it throws an error saying that "no streams are selected".
In the image of my VR debugging output above, you can see that the library is finding "1 Realsense Devices (mask 0xfe)". There's no way to add streams directly to the software device, and I cannot resolve a configuration with enabled streams with the pipeline.
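For what it's worth, the C# wrapper does expose a way to ask whether a configuration can be satisfied before starting the pipeline; a sketch of that check follows (whether it resolves against software streams is exactly the open question in this issue):

```csharp
using UnityEngine;
using Intel.RealSense;

public static class ConfigCheck
{
    public static void TryResolve(Pipeline pipeline)
    {
        var cfg = new Config();
        cfg.EnableAllStreams();

        // CanResolve reports whether the pipeline could satisfy this
        // config, without actually starting it.
        if (cfg.CanResolve(pipeline))
            Debug.Log("Config resolves; streams should be selectable.");
        else
            Debug.Log("Config cannot resolve; no matching streams found.");
    }
}
```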