D435: Max 7 fps when streaming /color/image_raw and /depth/color/points #2014
Hi @admantium-sg In the link below, a RealSense ROS user tried converting the SDK's open-source ethernet networking system to use ROS networking protocols and had similar performance limitations with their project's Raspberry Pi 4. They said "I only get an average of 2Hz when I run rostopic hz on the camera/depth/image_rect_raw, and 10Hz on camera/color/image_rect_raw". There was no solution achieved at the conclusion of that particular case, unfortunately.
Hi @admantium-sg Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received.
I find myself trying to use the same device for the same job and I'm having a similar issue: when using rs_rgbd.launch the fps are extremely low. The same doesn't happen when using rs_camera.launch. After investigating, the issue seems to be related to the fact that rs_camera.launch only launches the nodes for the camera's streams (depth and color), whilst rs_rgbd.launch also launches processing nodelets that provide depth-color aligned streams, compensating for parallax, etc.
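For reference, this is how the two launch files discussed above are invoked; the extra per-frame work done by the rs_rgbd.launch nodelets is the likely source of the slowdown on the Pi:

```shell
# rs_camera.launch: raw streams only (depth + color); lowest CPU load
roslaunch realsense2_camera rs_camera.launch

# rs_rgbd.launch: additionally loads processing nodelets
# (depth registration, depth-color alignment, point cloud),
# each of which adds per-frame CPU work on the Pi 4
roslaunch realsense2_camera rs_rgbd.launch
```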
Hi @DavidePatria If you are generating a point cloud like @admantium-sg was, I wonder whether a viable alternative solution to using rs_rgbd.launch would be to add ordered_pc:=true to an rs_camera roslaunch instruction to generate an ordered point cloud instead of the unordered cloud that is the default for rs_camera. For example: roslaunch realsense2_camera rs_camera.launch filters:=pointcloud ordered_pc:=true
I'm actually working with /camera/image_raw (or a similar name) and camera/aligned_color_with_depth/image_raw (or similar), so at the moment I'm not using the point cloud. I would use this setup to record a rosbag for later use in SLAM.
If it were not compulsory to use ROS to obtain a bag file, then Intel's open-source ethernet networking white paper may fit your description: using the Pi 4 just for capture, and accessing and processing the data remotely in real time on a more powerful central computer.
I'll definitely look into that, thank you. Another solution would be to make the process use all available threads, since I've noticed that only one core is maxed out while the others are under-used, but I guess that is a bit more complicated.
A RealSense ROS user at #1808 did manage to convert the Pi 4 system in the white paper to use ROS network protocols instead of the RTSP protocol in the paper, but found that it performed slowly compared to the original system, with a rate of 10Hz compared to 30 FPS in the original. The librealsense SDK can be built from source code to take advantage of multiple processor cores for color conversion and depth-color alignment by setting the CMake option BUILD_WITH_OPENMP to true, at the cost of increased processor usage.
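A minimal sketch of such a source build with the OpenMP option enabled (the clone path and job count are illustrative, not taken from this thread):

```shell
# Sketch: build librealsense from source with multi-core
# color conversion and depth-color alignment enabled
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense
mkdir build && cd build
cmake .. -DBUILD_WITH_OPENMP=true -DCMAKE_BUILD_TYPE=Release
make -j4          # -j4 suits the Pi 4's four cores
sudo make install
```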
My goal is to use the D435 with ROS1 on a headless Raspberry Pi 4B to stream image data (/color/image_raw) and depth data (/depth/color/points, as well as /depth/image_rect_raw) via WLAN to a Linux workstation. The concrete versions are:
The starting point to run ros-realsense on the Raspberry Pi 4B is the default command:
which I systematically evolved to:
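The exact commands did not survive in this thread. As a hedged sketch only, a parameterized rs_camera invocation of this kind typically looks like the following; every parameter value here is illustrative, not taken from the original report:

```shell
# Illustrative only -- the actual parameters used are not shown above.
# Typical knobs for trading resolution and rate against Pi 4 CPU load:
roslaunch realsense2_camera rs_camera.launch \
    color_width:=640 color_height:=480 color_fps:=30 \
    depth_width:=640 depth_height:=480 depth_fps:=30 \
    filters:=pointcloud \
    enable_infra1:=false enable_infra2:=false
```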
Systematically setting these parameters, and measuring the performance with rostopic hz from my Linux workstation, I came to the following results. So, to summarize:
A network ping between the Raspberry Pi 4B and the Linux workstation is 2 ms.
Why is the performance so slow? Is it not possible to stream at 30 fps in ROS1?