Example tf tree with T265 and a mobile robot #749
Comments
I'm not sure if this will help your particular case, but a user of a robot based around a Duo3D camera published an image of their tf tree. Left-click on the image to view it in full size.
Thanks @MartyG-RealSense! That's exactly the kind of tree I'd like to end up with, however I was unable to find the correct config. Here is the tree most resembling the above one: The issue with that one is that all the camera frames appear at the root of base_link. Any other advice would be highly appreciated!
Intel's Phillip Schmidt is likely to be well equipped to give the answer you need, as he gave a seminar about using RealSense with wheeled robots a couple of months ago. I'll link him into this discussion to highlight it. @schmidtp1 In the meantime, a recording of Phillip's seminar is available on YouTube if you have not seen it already. https://www.intelrealsense.com/visual-navigation-in-robotics/
Thanks @MartyG-RealSense! I wasn't aware of this seminar, it was a very good watch! If you could share some launch files or configs for the cameras then I bet it would be a very good starting point. If @schmidtp1 can also give us some pointers we are more than happy to create a PR for the wikis/minimal launch file for a mobile robot setup.
There was a discussion about T265 intrinsics, extrinsics and calibration over on the Intel Support website today. This includes links to an example program and a sample file for wheel odometry calibration.
Hi @msadowski, I created a simple example assembly using the Kobuki robot base and T265 facing forward. You can adapt this launch file for your needs: To fuse wheel odometry, currently a json configuration file is still required: We are working on simplifying the usage. Thank you for trying it out and thank you for your patience.
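For reference, the wheel-odometry calibration file mentioned above follows the format of the sample shipped with librealsense. The sketch below is illustrative only; the identity `scale_and_alignment`, the variance values, and the zero extrinsics are placeholders you would replace with your own calibration:

```json
{
    "velocimeters": [
        {
            "scale_and_alignment": [1.0, 0.0, 0.0,
                                    0.0, 1.0, 0.0,
                                    0.0, 0.0, 1.0],
            "noise_variance": 0.004,
            "extrinsics": {
                "T": [0.0, 0.0, 0.0],
                "T_variance": [1e-4, 1e-4, 1e-4],
                "W": [0.0, 0.0, 0.0],
                "W_variance": [1e-4, 1e-4, 1e-4]
            }
        }
    ]
}
```

Here `T` and `W` describe the translation and rotation between the T265 and the wheel-odometry sensor frame; check the librealsense wheel-odometry documentation for the exact conventions before using such a file.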
@schmidtp1 thanks for the information! It's really useful! Can you check if in your setup the base_frame_id argument is set to camera_link (that's what I would expect based on your tree)?
Since in standard cases the offset between base_link and camera_link is always static, I'm wondering if the tree would be a bit 'cleaner' if a tf transform between base_link and camera_link could be specified in the URDF and the realsense_manager could then perform a lookup. This way I imagine we could avoid a static transform between camera_pose_frame and base_link.
I also noticed there is another detached tree (odom->base_footprint). Normally I'd expect the tree to be organized as odom->base_footprint->base_link, but in the case of your tree this would conflict with the static transform. I assume this transform could be disabled in the odometry node (I assume the T265 only listens to the information on the odometry topic).
Did you also happen to use this setup with some existing mapping packages? I'm wondering if there are specific things you needed to do to make map->camera_odom work reliably for localization.
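The URDF idea suggested above would look roughly like the sketch below: a fixed joint between base_link and camera_link, so robot_state_publisher broadcasts the static offset. The joint name and offset values are illustrative, not from the original discussion:

```xml
<!-- Sketch only: fixed mounting of the T265 on the robot base.
     Replace the xyz/rpy values with the real mounting offset. -->
<joint name="camera_joint" type="fixed">
  <parent link="base_link"/>
  <child link="camera_link"/>
  <origin xyz="0.15 0.0 0.10" rpy="0 0 0"/>
</joint>
```

Note that, as discussed later in this thread, the realsense driver itself publishes camera_pose_frame -> camera_link, so adding this joint alone can give camera_link two parents.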
@schmidtp1 how do you transform the poses reported in the camera frame to base_link? I am doing the same thing as you are (static transform from camera_pose_frame to base_link) but the resulting odometry and poses still trace an 'arc' when the robot rotates rather than an in-place rotation. Would you be kind enough to provide a complete tree with map->camera_odom_frame->camera_pose_frame->base_link transforms? This would come in handy when trying to implement an EKF package. What would the "map" frame in your example ideally attach to, the camera_odom_frame or the camera_pose_frame? Thanks!
With the transform configuration as mentioned above
Hello! Just to preface this: I'm a complete beginner with both ROS and the RealSense cameras. I have a mobile robot that I hope to use the T265 with. Essentially I want to add 4 other links to the tf tree: base_link, the two wheels and a laser sensor. Currently, I have a tf broadcaster node that publishes static tf transformations between these 4 links. The tf tree looks like this:
However, now I need to connect these static transformations to the dynamic (is that what they're called?) transformations provided by the T265. In other words, when the camera pose frame of the T265 changes its position, I want that reflected in the position of base_link (and subsequently the three other links) using the static transformations I've defined above. So to do that, I followed the advice above, edited the rs_t265.launch file and added the following line: As you can see, they're not connected with one another. However, in some instances I do manage to get a tree that is fully connected in the way I want. But even in those instances, only the static transformations for base_link and the other base_* frames are being broadcast. Is there something I can do to change that? Furthermore, once I manage to solve the problem above, I have the same question as @manomitbal above:
Thank you very much!
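The attachment described in the comments above (camera_pose_frame as the parent of base_link) can be published from a launch file with a static transform. This is only a sketch; the node name and the offset (the inverse of the camera's mounting offset on the robot) are placeholders:

```xml
<!-- Sketch: attach base_link under the T265's pose frame.
     Args are "x y z yaw pitch roll parent child"; values are illustrative. -->
<node pkg="tf2_ros" type="static_transform_publisher" name="camera_to_base_tf"
      args="-0.15 0 -0.10 0 0 0 camera_pose_frame base_link"/>
```

This gives a connected tree of the form camera_odom_frame->camera_pose_frame->base_link, at the cost of base_link no longer being the root of the robot's own subtree.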
@schmidtp1 just as @msadowski mentioned on May 3rd, your tree shows the odom->base_link tree unconnected to the rest of the tree from the camera. Can you explain how this should attach? I am currently using cartographer and RTAB-Map and I am struggling to find the right way to set up the tree.
@schmidtp1 Does it mean the line arg name="topic_odom_in" default="$(arg tf_prefix)/odom_in" |
I have done what @schmidtp1 did and fed the wheel odometry into the T265. Is there any way to solve this issue?
Edit: Please scrap the below. It doesn't work as I expected! I think I'm really close to having the tf tree look exactly as I would expect! The issue is that the rs_t265 pose is disconnected from rs_t265_link at the moment. The relevant launch file settings I used are:
If anyone has some feedback on this it would be highly appreciated!
@msadowski I have this issue, can someone help me with this?
@Ping-Ju apologies, I just realized that my setup wasn't working as I expected and updated my comment to indicate that. I think the best way to start is to go through the comments here: #711 It's quite weird that your sensor doesn't output any velocity at all. I'd check it with the realsense software (realsense-viewer) first to make sure you can see it there. |
Have you set publish_odom_tf (odom -> base_link) to "false" on your robot (not the T265)?
Thanks @msadowski, I think I have solved it, because the odom tf from the robot needs to be turned off.
Can you please share how you set up the tf tree, via code or your launch file? I have set up the tf tree as shown but it did not work as expected: the T265 information still does not coincide with the origin of the robot (which is base_link).
Hello y'all, I have been having the exact same issue. I would love to have a nice tf tree with my own base link but something prevents that from working. I think what is happening is that somewhere in the RealSense code there is an explicit zero-transform (identity, nothing changes) from [camera_pose_frame --> camera_link]. Whenever I attempt to move camera_link to my desired location [my_link --> camera_link] it reverts back to [camera_pose_frame --> camera_link].
Here is a short update from me. I'll try to describe it properly on my blog in the next week or two, but hopefully it should get you started. @Enmar123 @ghuan4 you might find it useful.
When I started working with the T265 I wanted to use it as a source of odometry, providing an odom->base_link transform. With the way the realsense driver is set up I don't think you can create a proper tf tree that keeps the structure similar to odom->base_link->camera_link. The reason is that the driver publishes camera_pose_frame -> camera_link, and this would cause camera_link to have two parents (base_link and camera_pose_frame), which is not allowed.
To get around these problems I ended up setting the "publish_odom_tf" parameter to false and instead used robot_localization with a single source of odometry (the T265 odometry topic). This way I have a solid tf tree. A word of advice though: for this setup to be fully REP-105 compliant you need to set enable_pose_jumping to false, but this seems to cause other issues for me.
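A robot_localization setup with the T265 as the single odometry source might look like the YAML sketch below. This is an assumption-laden example, not the author's actual config: the topic name `/camera/odom/sample` is the default T265 odometry topic in realsense-ros, but the frame names, update rate, and the choice of fused variables in `odom0_config` are placeholders to adjust for your robot:

```yaml
# Sketch of an ekf_localization_node config (robot_localization)
# fusing only the T265 odometry into odom -> base_link.
frequency: 30
two_d_mode: true            # assumption: planar wheeled robot
odom_frame: odom
base_link_frame: base_link
world_frame: odom
publish_tf: true

odom0: /camera/odom/sample
odom0_config: [true,  true,  false,   # x, y, z
               false, false, true,    # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
odom0_differential: false
```

With publish_odom_tf disabled on the camera node, this node becomes the sole publisher of odom -> base_link, which avoids the two-parents problem described above.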
Hi again. I wanted to let you guys know about a workaround I found to this problem in case it helps. In essence what I did was create a fake base_link using a static transform from [camera_pose_frame --> fake_base_link], which places the fake base link in the same relative position as my real base_link. I then grabbed the xyz coords of the fake base link using tf.lookupTransform(cam_odom_frame, fake_base_link) and put them into a new odometry message. I then filled out the rest of the odometry message with the covariance and the header. This let me use my new odom message with robot_localization and my other IMU and encoder measurements. You do end up with two trees, but hey, it works I guess. I don't quite know how to do stuff here yet, so I'll just attach a text file of my Python script.
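The lookup-and-republish trick above boils down to composing the camera's pose with the static camera-to-base offset. A minimal, ROS-free sketch of that math (frame names, the 0.2 m offset, and the example pose are all illustrative):

```python
import math
import numpy as np

def quat_to_mat(q):
    """Rotation matrix from an (x, y, z, w) quaternion, as used by tf."""
    x, y, z, w = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])

def base_position_in_odom(t_cam, q_cam, t_offset):
    """Position of a frame rigidly offset from the camera, in the odom frame.

    t_cam, q_cam: camera_pose_frame pose in camera_odom_frame
    t_offset:     static camera_pose_frame -> fake_base_link translation
    """
    return quat_to_mat(q_cam) @ np.asarray(t_offset) + np.asarray(t_cam)

# Camera at (1, 0, 0) yawed 90 degrees; base link 0.2 m behind the camera.
s, c = math.sin(math.pi / 4), math.cos(math.pi / 4)
p = base_position_in_odom([1.0, 0.0, 0.0], [0.0, 0.0, s, c], [-0.2, 0.0, 0.0])
print(p)  # -> approximately [1.0, -0.2, 0.0]
```

This is exactly what tf's lookupTransform does for you once the static transform is on the tree; the sketch just makes the rotation-then-translation order explicit.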
@Enmar123 would you be able to make a screenshot of the tf tree? I would love to see it! |
@msadowski Sure thing! Here is my tf tree. I also put the code into a ros package so you can have a better idea of what i did. |
Hi Mateusz, Are these examples sufficient for your needs? |
@RealSenseCustomerSupport Yes, I think the examples given show nicely what is possible with the current state of the software. I hope that one day the driver will be refactored to publish odom->base_link transform using the camera odometry. |
Can someone verify if the frames in the URDF file of T265 here are correct? |
@msadowski Thanks man. It's 2021 and it's still a problem. In my case I wanted to fuse wheeled odometry with visual odometry from the T265, but I experienced almost all of the issues mentioned above (i.e. a broken tf tree). Inspired by your contribution in the field, here is how I was able to set things up:
And now it works. |
Hi @mbed92, I want to fuse wheel odometry with IMU data from the D435i. I want to try your solution, but what is the wheel_odometry topic? There is no wheel_odometry param in the realsense2_camera node.
Hey, by wheel_odometry I meant the topic with nav_msgs/Odometry messages where you calculate the robot's pose and twist based on e.g. wheel encoders. I passed that topic to the T265 camera node via the param topic_odom_in, and the node then fuses both odometries (wheeled and visual). I guess that in your case you should use a fusion algorithm, e.g. the EKF from the robot_localization package, where you pass in all the sources (wheeled odometry, IMU, etc.). You might look into example configurations there.
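Wiring the wheel-odometry topic into the T265 node could look like the launch sketch below. The arg names match rs_t265.launch in realsense-ros at the time of this thread, but verify them against your driver version; the topic name, package name, and calibration file path are placeholders:

```xml
<launch>
  <!-- Sketch: feed wheel odometry into the T265 for on-device fusion.
       /wheel/odom and the calib file path are placeholders. -->
  <include file="$(find realsense2_camera)/launch/rs_t265.launch">
    <arg name="topic_odom_in"   value="/wheel/odom"/>
    <arg name="calib_odom_file" value="$(find my_robot)/config/t265_calib_odom.json"/>
    <arg name="publish_odom_tf" value="false"/>
  </include>
</launch>
```

Disabling publish_odom_tf here leaves the odom -> base_link transform to an external estimator such as robot_localization, as described earlier in the thread.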
Hi!
We've been trying to implement a RealSense T265 on a wheeled robot for quite a while now and we've been wondering if anyone could share a tf_tree on a working mobile robot? If someone could share a launch file too that would be superb!
I've been trying to set the base_frame_id to base_link, however this moves all the camera related frames to the center of the robot (even though there is a static transform between base_link and camera frame).
Another thing that's not clear to me is how the pose_frame relates to the odometry frame.