
D435: Max 7 fps when streaming /color/image_raw and /depth/color/points #2014

Closed
admantium-sg opened this issue Aug 1, 2021 · 9 comments


admantium-sg commented Aug 1, 2021

My goal is to use the D435 with ROS1 on a headless Raspberry Pi 4B to stream image data (/color/image_raw) and depth data (/depth/color/points as well as /depth/image_rect_raw) over WLAN to a Linux workstation.

The concrete versions are:

  • Raspberry Pi 4B
  • Ubuntu 20.04 Server (headless)
  • ROS1 Noetic (installed as a Debian package)
  • librealsense 2.48.0 (installed from source)
  • ros-realsense 2.3.1 (installed from source)

The starting point for running ros-realsense on the Raspberry Pi 4B is the default command:

roslaunch realsense2_camera rs_camera.launch camera:=camera1

which I systematically extended to:

roslaunch realsense2_camera rs_camera.launch camera:=camera1 depth_width:=640 color_width:=640 depth_height:=480 color_height:=480 depth_fps:=6 color_fps:=6 pointcloud_texture_stream:=RS2_STREAM_COLOR enable_sync:=false align_depth:=false initial_reset:=false filters:=pointcloud

By systematically varying these parameters and measuring the performance with rostopic hz from my Linux workstation, I arrived at the following results:

| Parameter | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Run 6 | Run 7 | Run 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| depth & color width | not set | not set | 640 | 640 | 640 | 640 | 640 | 640 |
| depth & color height | not set | not set | 480 | 480 | 480 | 480 | 480 | 480 |
| depth_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| color_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| initial_reset | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | |
| enable_sync | TRUE | | | | | | | |
| align_depth | | | | | | | | |
| filters | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud |
| texture_stream | any | color | any | any | color | color | color | color |
| ordered_pc | yes | | | | | | | |

| Topic (Hz) | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Run 6 | Run 7 | Run 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /color/image_raw | 6.5 | no data | 1.5 | 1.5 | no data | 6 | 6 | 6 |
| /depth/color/points | no data | no data | no data | no data | no data | 3 | 1 | 3 |
| /depth/image_rect_raw | 7 | 6.5 | no data | no data | 6 | 6 | 6 | 6 |

So, to summarize:

  • I cannot get more than 7 fps on any stream.
  • I only get point cloud data when manually setting 6 fps.

A network ping between the Raspberry Pi 4B and the Linux workstation is 2 ms.
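
For reference, the measurements assume the standard ROS1 network setup between the two machines; a minimal sketch (the IP addresses are illustrative):

# On the Raspberry Pi 4B (runs the ROS master):
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10
roslaunch realsense2_camera rs_camera.launch camera:=camera1

# On the Linux workstation:
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20
rostopic hz /camera1/color/image_raw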

Why is the performance so slow? Is it not possible to stream at 30 fps in ROS1?


MartyG-RealSense commented Aug 1, 2021

Hi @admantium-sg In the link below, a RealSense ROS user tried converting the SDK's open-source ethernet networking system to use ROS networking protocols and had similar performance limitations with their project's Raspberry Pi 4. They said "I only get an average of 2Hz when I run rostopic hz on the camera/depth/image_rect_raw, and 10Hz on camera/color/image_rect_raw".

#1808

Unfortunately, no solution was reached by the conclusion of that particular case.

@MartyG-RealSense

Hi @admantium-sg Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense

Case closed due to no further comments received.

@DavidePatria

I find myself trying to use the same device for the same job, and I'm having a similar issue: with rs_rgbd.launch the fps are extremely low, while with rs_camera.launch they are not. After investigating, the cause seems to be that rs_camera.launch only launches the nodes for the camera's raw streams (depth and color), whereas rs_rgbd.launch also launches processing nodelets that provide depth-aligned color streams, compensate for parallax, and so on.
The problem is that the Raspberry Pi cannot keep up with the heavy computation these nodelets require; no other limitation comes into play (USB has enough bandwidth).
Using the Raspberry Pi would solve many of my problems, so I thought that if the processed streams could be generated asynchronously (on another machine, perhaps) by way of rosbags and the like, the streams could be recorded at a decent framerate and the Raspberry Pi could act purely as a recording device for the D435i, as sketched below.
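
For illustration, such a capture-then-process split might look like this (a sketch; the topic names assume the default rs_camera.launch namespace, and the bag filename is arbitrary):

# On the Raspberry Pi: launch only the raw streams, then record them
roslaunch realsense2_camera rs_camera.launch
rosbag record -O d435i.bag /camera/color/image_raw /camera/color/camera_info /camera/depth/image_rect_raw /camera/depth/camera_info

# Later, on a more powerful machine: replay the bag and run the heavy
# processing (alignment, point cloud generation, etc.) on the recorded topics
rosbag play d435i.bag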
Any help is greatly appreciated.


MartyG-RealSense commented Oct 21, 2021

Hi @DavidePatria If you are generating a point cloud like @admantium-sg was, I wonder whether a viable alternative to rs_rgbd.launch would be to add ordered_pc:=true to an rs_camera roslaunch instruction, generating an ordered point cloud instead of the unordered cloud that is rs_camera's default. For example:

roslaunch realsense2_camera rs_camera.launch filters:=pointcloud ordered_pc:=true
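
The resulting rate can then be checked from the remote machine (the topic name assumes the default camera namespace):

rostopic hz /camera/depth/color/points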

@DavidePatria

I'm actually working with /camera/image_raw (or a similar name) and camera/aligned_color_with_depth/image_raw (or similar), so at the moment I'm not using the point cloud. I would use this setup to record a rosbag for later use in SLAM.

@MartyG-RealSense

If it is not compulsory to use ROS to obtain a bag file, then Intel's open-source ethernet networking white paper may fit your description: using the Pi 4 just for capture while accessing and processing the data remotely, in real time, on a more powerful central computer.

https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras
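
In that design the Pi only serves the camera over the LAN and the heavy processing happens on the client; roughly (a sketch, assuming librealsense was built with its networking module enabled, e.g. the BUILD_NETWORK_DEVICE CMake option):

# On the Raspberry Pi: serve the attached camera over the network
rs-server

# On the central computer, a client application opens the remote camera
# through the SDK's network-device API instead of a local USB device.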

@DavidePatria

I'll definitely look into that, thank you.
Still, being able to use it as I described would make things much easier, given that the file formats (SLAM and RealSense) are probably not compatible and share only ROS as a common platform. The SLAM system I'm using is OpenVSLAM, which accepts mp4 files but says nothing about files containing depth, so out of compatibility concerns I decided to exchange data through ROS, which my project uses anyway.

Another solution would be to make the process use all available threads at maximum speed, since I've noticed that only one core is maxed out while the others sit mostly idle, but I guess that is a bit more complicated.


MartyG-RealSense commented Oct 21, 2021

A RealSense ROS user at #1808 did manage to convert the Pi 4 system in the white paper to use ROS network protocols instead of the paper's RTSP protocol, but found that it performed slowly compared to the original system: a rate of 10 Hz versus the original 30 FPS.

The librealsense SDK can be built from source to take advantage of multiple processor cores for color conversion and depth-color alignment by setting the CMake build flag BUILD_WITH_OPENMP to true, at the cost of increased processor usage.
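
A minimal rebuild sketch (paths and job count are illustrative):

cd librealsense
mkdir build && cd build
cmake .. -DBUILD_WITH_OPENMP=true -DCMAKE_BUILD_TYPE=Release
make -j4
sudo make install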

