
Example tf tree with T265 and a mobile robot #749

Closed
msadowski opened this issue Apr 30, 2019 · 30 comments

@msadowski

Hi!

We've been trying to implement a RealSense T265 on a wheeled robot for quite a while now and we've been wondering if anyone could share a tf_tree on a working mobile robot? If someone could share a launch file too that would be superb!

I've been trying to set the base_frame_id to base_link, however this moves all the camera-related frames to the center of the robot (even though there is a static transform between base_link and the camera frame).

Another thing that's not clear to me is how the pose_frame relates to the odometry frame.

@MartyG-RealSense
Collaborator

I'm not sure if this will help your particular case, but a user of a robot based around a Duo3D camera published an image of their tf tree. Left-click on the image to view it in full size.

https://answers.ros.org/upfiles/1553604436748443.jpg

@msadowski
Author

Thanks @MartyG-RealSense! That's exactly the kind of tree I'd like to end up with, however I was unable to find the correct config. Here is the tree most resembling the above one:

image

The issue with that one is that all the camera frames appear at the origin of base_link. Any other advice would be highly appreciated!

@MartyG-RealSense
Collaborator

Intel's Phillip Schmidt is likely to be well equipped to give the answer you need, as he gave a seminar about using RealSense with wheeled robots a couple of months ago. I'll link him into this discussion to highlight it to him. @schmidtp1

In the meantime, a recording of Phillip's seminar is available on YouTube if you have not seen it already.

https://www.intelrealsense.com/visual-navigation-in-robotics/

@msadowski
Author

Thanks @MartyG-RealSense! I wasn't aware of this seminar, it was a very good watch! If you could share some launch files or configs for the cameras then I bet it will be a very good starting point. If @schmidtp1 can also give us some pointers we are more than happy to create a PR for the wikis/minimal launch file for a mobile robot setup.

@MartyG-RealSense
Collaborator

There was a discussion about T265 intrinsics, extrinsics and calibration over on the Intel Support website today. This includes links to an example program and a sample file for wheel odometry calibration.

https://forums.intel.com/s/question/0D50P00004K1IE9SAN/where-can-i-find-the-camera-calibration-file-for-realsense-t265?language=en_US

@schmidtp1 schmidtp1 added the T265 label Apr 30, 2019
@schmidtp1
Contributor

Hi @msadowski, I created a simple example assembly using the Kobuki robot base and T265 facing forward.
kobuki_t265_2_rviz_screenshot_2019_05_02-09_34_49
This is the corresponding tf tree:
frames_kobuki_t265_2
I used the camera pose frame with respect to the camera odom frame (as estimated by T265) to attach it to the robot (base link):

  static_transform_publisher -0.2 0 -0.1 0 0 0 camera_pose_frame base_link

You can adapt this launch file for your needs:
https://github.com/intel-ros/realsense/blob/development/realsense2_camera/launch/rs_t265.launch

To fuse wheel odometry, currently a json configuration file is still required:
https://github.com/intel-ros/realsense/blob/development/realsense2_camera/launch/rs_t265.launch#L9
The format is described here: https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md#appendix
and the odometry topic has to be mapped to this one: https://github.com/intel-ros/realsense/blob/development/realsense2_camera/launch/includes/nodelet.launch.xml#L74
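Put together, those steps might look like this in a launch file. This is only a sketch: the json path and the /odom topic are placeholders for your robot, and the arg names should be double-checked against the linked rs_t265.launch and nodelet.launch.xml for your driver version:

```xml
<launch>
  <!-- T265 driver; calib_odom_file and topic_odom_in correspond to the
       wheel-odometry hooks referenced above. Path and topic are placeholders. -->
  <include file="$(find realsense2_camera)/launch/rs_t265.launch">
    <arg name="calib_odom_file" value="$(find my_robot)/config/calibration_odometry.json"/>
    <arg name="topic_odom_in"   value="/odom"/>
  </include>

  <!-- Attach the camera pose frame to the robot base, as in the command above
       (the last arg is the publish period in ms expected by tf's publisher). -->
  <node pkg="tf" type="static_transform_publisher" name="camera_to_base"
        args="-0.2 0 -0.1 0 0 0 camera_pose_frame base_link 100"/>
</launch>
```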

We are working on simplifying the usage. Thank you for trying it out and thank you for your patience.

@msadowski
Author

@schmidtp1 thanks for the information! It's really useful!

Can you check if in your setup the base_frame_id argument is set to camera_link (that's what I would expect based on your tree)?

Since in standard cases the offset between base_link and camera_link is always static, I'm wondering if the tree would be a bit 'cleaner' if a tf transform between base_link and camera_link could be specified in the URDF and then the realsense_manager could perform a lookup. This way I imagine we could avoid a static transform between camera_pose_frame and base_link.
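For illustration, the URDF arrangement described above might look like the following fixed joint (a sketch with made-up numbers; the current driver does not perform this lookup, so this alone won't produce the desired tree):

```xml
<!-- Hypothetical: mount the T265 0.2 m forward and 0.1 m above base_link -->
<joint name="camera_mount_joint" type="fixed">
  <parent link="base_link"/>
  <child link="camera_link"/>
  <origin xyz="0.2 0 0.1" rpy="0 0 0"/>
</joint>
```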

I also noticed there is another detached tree (odom->base_footprint). Normally I'd expect the tree to be organized as odom->base_footprint->base_link, but in the case of your tree this would conflict with the static transform. I assume this transform could be disabled in the odometry node (I also assume the T265 only listens to the information on the odometry topic).

Did you also happen to use this setup with some existing mapping packages? I'm wondering if there are some specific things that you needed to do to make map->camera_odom work reliably for localization.

@manomitbal

manomitbal commented May 6, 2019

@schmidtp1 how do you transform the poses produced in the camera frame to base_link? I am doing the same thing as you (a static transform from camera_pose_frame to base_link) but the resulting odometry and poses still trace an 'arc' when the robot rotates rather than an in-place rotation.

Would you be kind enough to provide a complete tree with map->camera_odom_frame->camera_pose_frame->base_link transforms? This would come in handy when trying to implement an EKF package.

What would the "map" frame in your example ideally attach to, the camera_odom_frame or the camera_pose_frame?

Thanks!

@manomitbal

With the transform configuration mentioned above (static_transform_publisher -0.2 0 -0.1 0 0 0 camera_pose_frame base_link), do both the velocities and the poses published on the "camera/odom/sample" topic also get transformed to base_link?

@shaimayshah

Hello!

Just to preface this: I’m a complete beginner with both, ROS and the Realsense cameras.

I have a mobile robot that I hope to use the t265 with. Essentially I want to add 4 other links to the tf tree: base_link, the two wheels and a laser sensor. Currently, I have a tf broadcaster node that publishes static tf transformations between these 4 links. The tf tree of which looks like this:

frames1
The code is very simple (I got it from the navigation robot setup ROS tutorial).
  broadcaster.sendTransform(tf::StampedTransform(
      tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0, 0, 0)),
      ros::Time::now(), "base_link", "base_laser"));
  broadcaster.sendTransform(tf::StampedTransform(
      tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0, 0, 0)),
      ros::Time::now(), "base_link", "base_rwheel"));
  broadcaster.sendTransform(tf::StampedTransform(
      tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0, 0, 0)),
      ros::Time::now(), "base_link", "base_lwheel"));
(Removed the exact transformations)

However, now I need to connect these static transformations that I have to the dynamic (is that what they’re called?) transformations provided by the t265. In other words, when the camera pose frame of the t265 changes its position, I want that reflected in the position of the base_link (and subsequently the three other links) using the static transformations that I've defined in the file above. So in order to do that, I followed the advice above and edited the rs_t265.launch file and added the following line:
  <node pkg="tf" type="static_transform_publisher" name="t265_connector" args="0 0 0 0 0 0 0 camera_pose_frame base_link 10000"/>
This changes my tf tree to the following (when the tf broadcaster node and this launch file are launched together).

frames

As you can see, they're not connected with one another. However, in some instances I do manage to get a tree that is fully connected in the way I want. But even in those instances, it's only the static transformations for base_link and the other base_* frames that are being broadcast. Is there something I can do to change that?

Furthermore, and this is once I manage to solve the problem above, I have the same question as @manomitbal above:

With the transform configuration mentioned above (static_transform_publisher -0.2 0 -0.1 0 0 0 camera_pose_frame base_link), do both the velocities and the poses published on the "camera/odom/sample" topic also get transformed to base_link?

Thank you very much!

@jmachuca77

@schmidtp1 just as @msadowski mentioned on May 3rd, your tree shows the odom->base_link tree unconnected to the rest of the tree from the camera. Can you explain how this should attach? I am currently using Cartographer and RTAB-Map and I am struggling to find the right way to set up the tree.

@imasoul2

imasoul2 commented Aug 23, 2019

@schmidtp1
Hello, your setup for the calibration has been helpful, but could you kindly elaborate on this comment?
"the odometry topic has to be mapped to this one: https://github.com/intel-ros/realsense/blob/development/realsense2_camera/launch/includes/nodelet.launch.xml#L74"

Does it mean the line <arg name="topic_odom_in" default="$(arg tf_prefix)/odom_in"/>?
Since tf_prefix is camera by default, do we need to change /odom_in to /odom/sample?

@Ping-Ju

Ping-Ju commented Aug 28, 2019

I have done what @schmidtp1 did and fed the wheel odometry into the T265.
tf_tree
But the problem is that when I open rviz with the tf display, with the fixed frame set to camera_odom_frame, the base_link and base_laser frames keep flashing:
Screenshot from 2019-08-28 17-10-32
and
aa

Is there any way to solve this issue?

@msadowski
Author

msadowski commented Aug 29, 2019

Edit: Please scrap the below. It doesn't work as I expected!

I think I'm really close to having the tf tree looking exactly as I would expect!

image

The issue is that the rs_t265 pose is disconnected from rs_t265_link at the moment. The relevant launch file settings I used are:

  <arg name="camera"              default="rs_t265"/>
  <arg name="tf_prefix"           default="$(arg camera)"/>
  <arg name="publish_odom_tf"     value="true"/>
  <arg name="odom_frame_id"         value="odom"/>
  <arg name="pose_frame_id"         value="base_link"/>

If anyone has some feedback on this it would be highly appreciated!

@Ping-Ju

Ping-Ju commented Aug 30, 2019

@msadowski
I have set things up the way you did there.
But the problem is that when the robot starts moving, the odometry from the T265 (/rs_t265/odom/sample) doesn't have any velocity (twist.twist.linear).
I have fed in my robot's wheel odometry (/odom) and a calibration file (/.../config/calibration_odometry.json), and for unite_imu_method I put "linear_interpolation".

I have this issue, can someone help me with this?

@msadowski
Author

@Ping-Ju apologies, I just realized that my setup wasn't working as I expected and updated my comment to indicate that. I think the best way to start is to go through the comments here: #711

It's quite weird that your sensor doesn't output any velocity at all. I'd check it with the realsense software (realsense-viewer) first to make sure you can see it there.

@Ping-Ju

Ping-Ju commented Sep 2, 2019

Have you set publish_odom_tf (odom -> base_link) to "false" on your robot (not the T265)?

@Ping-Ju

Ping-Ju commented Sep 3, 2019

Thanks @msadowski, I think I have solved it, because the odom tf from the robot needs to be closed (disabled)..

@ghuan4

ghuan4 commented Sep 11, 2019

Thanks @msadowski, I think I have solved it, because the odom tf from the robot needs to be closed (disabled)..

Can you please share how you set up the tf tree via code or your launch file? I have set up the tf tree as shown, but it did not work as expected: the T265 information still does not coincide with the origin of the robot (base_link).

image

@Enmar123

Hello y'all, I have been having the exact same issue. I would love to have a nice tf tree with my own base link, but something prevents that from working. I think what is happening is that somewhere in the RealSense code there is an explicit zero transform (in-place, nothing changes) from [camera_pose_frame --> camera_link]. Whenever I attempt to move camera_link to my desired location [my_link --> camera_link], it reverts back to [camera_pose_frame --> camera_link].

@msadowski
Author

Here is a short update from me. I'll try to describe it properly on my blog in the next week or two, but hopefully it should get you started. @Enmar123 @ghuan4 you might find it useful.

When I started working with the T265 I wanted to use it as a source of odometry, providing an odom->base_link transform. With the way the realsense driver is set up, I don't think you can build a proper tf tree that keeps a structure similar to [odom->base_link->camera_link]. The reason is that the driver publishes camera_pose_frame -> camera_link, and this would cause camera_link to have two parents (base_link and camera_pose_frame), which is not allowed.

To get around these problems I ended up setting the "publish_odom_tf" parameter to false and instead used robot_localization with a single source of odometry (the T265 odometry topic). This way I have a solid tf tree. A word of advice though: for this setup to be fully REP-105 compliant you need to set enable_pose_jumping to false, but this seems to cause other issues for me.
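A minimal robot_localization hookup along those lines might look like this. It is only a sketch: the node and parameter names come from the robot_localization package, while the odometry topic name and the choice of fused fields are assumptions to adapt to your robot:

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom">
    <param name="world_frame"     value="odom"/>
    <param name="odom_frame"      value="odom"/>
    <param name="base_link_frame" value="base_link"/>
    <!-- Single odometry input: the T265 topic (name is an assumption) -->
    <param name="odom0" value="/camera/odom/sample"/>
    <!-- Fuse planar velocities and yaw rate only -->
    <rosparam param="odom0_config">
      [false, false, false,
       false, false, false,
       true,  true,  false,
       false, false, true,
       false, false, false]
    </rosparam>
  </node>
</launch>
```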

@Enmar123

Hi again. I wanted to let you guys know about a workaround I found to this problem, in case it helps.

In essence, what I did was create a fake base_link using a static transform from [camera_pose_frame --> fake_base_link], which places the fake base link in the same relative position as my real base_link. I then grabbed the xyz coords of the fake base link using tf.lookupTransform(cam_odom_frame, fake_base_link) and chucked them into a new odometry message. I then filled out the rest of the odometry message with the covariance and the header.

This lets me use my new odom message with robot_localization and my other IMU and encoder measurements. You do end up with two trees, but hey, it works I guess.

I don't quite know how to do stuff here yet, so I'll just attach a text file of my python script.

tf_odom_v4_share.txt
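The lookup-and-republish trick above boils down to composing the static camera-to-base offset with the live T265 pose. Here is a minimal, ROS-free sketch of just that math (the function names are mine, and the -0.2/-0.1 offset is the one from schmidtp1's example):

```python
def _cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    u = (x, y, z)
    # v' = v + 2 * u x (u x v + w * v)
    t = _cross(u, v)
    t = (t[0] + w * v[0], t[1] + w * v[1], t[2] + w * v[2])
    t = _cross(u, t)
    return (v[0] + 2 * t[0], v[1] + 2 * t[1], v[2] + 2 * t[2])

def base_pose_in_odom(cam_t, cam_q, base_offset):
    """Pose of the (fake) base_link in camera_odom_frame, given the T265 pose
    (cam_t, cam_q) and the static base_offset expressed in camera_pose_frame."""
    off = quat_rotate(cam_q, base_offset)
    return ((cam_t[0] + off[0], cam_t[1] + off[1], cam_t[2] + off[2]), cam_q)

# With an identity camera orientation, base_link simply sits at the fixed offset:
# base_pose_in_odom((1.0, 2.0, 3.0), (0.0, 0.0, 0.0, 1.0), (-0.2, 0.0, -0.1))
```

In the real node this translation and quaternion would be stamped into a nav_msgs/Odometry message in camera_odom_frame, as the attached script does via tf.lookupTransform; if the camera is mounted rotated relative to the base, the offset's rotation would have to be composed in as well.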

@msadowski
Author

@Enmar123 would you be able to make a screenshot of the tf tree? I would love to see it!

@Enmar123

Enmar123 commented Sep 25, 2019

@msadowski Sure thing! Here is my tf tree. I also put the code into a ROS package so you can have a better idea of what I did.

functional_tree

functional_tree_zoom

@RealSenseCustomerSupport
Collaborator


Hi Mateusz,

Are these examples sufficient for your needs?

@msadowski
Author

@RealSenseCustomerSupport Yes, I think the examples given show nicely what is possible with the current state of the software. I hope that one day the driver will be refactored to publish odom->base_link transform using the camera odometry.

@saikishor
Contributor

Can someone verify if the frames in the URDF file of T265 here are correct?
https://github.com/pal-robotics/realsense_gazebo_plugin/blob/kinetic-devel/urdf/t265.urdf.xacro

@mbed92

mbed92 commented May 18, 2021

Here is a short update from me. I'll try to describe it properly on my blog in the next week or two, but hopefully it should get you started. @Enmar123 @ghuan4 you might find it useful.

When I started working with the T265 I wanted to use it as a source of odometry, providing an odom->base_link transform. With the way the realsense driver is set up, I don't think you can build a proper tf tree that keeps a structure similar to [odom->base_link->camera_link]. The reason is that the driver publishes camera_pose_frame -> camera_link, and this would cause camera_link to have two parents (base_link and camera_pose_frame), which is not allowed.

To get around these problems I ended up setting the "publish_odom_tf" parameter to false and instead used robot_localization with a single source of odometry (the T265 odometry topic). This way I have a solid tf tree. A word of advice though: for this setup to be fully REP-105 compliant you need to set enable_pose_jumping to false, but this seems to cause other issues for me.

@msadowski Thanks man. It's 2021 and it's still a problem. In my case I wanted to fuse wheel odometry with the visual one from the T265, but I experienced almost all of the issues mentioned above (e.g. a broken tf tree). Inspired by your contribution in the field, here is how I was able to set things up:

  • provided a calibration_odom.json to move the T265 localization to my base frame, aka base_footprint - link
  • fed the realsense ros node a wheel_odometry topic (also in the base footprint frame; that's why we need the calibration file)
  • set the publish_odom_tf arg to false
  • launched the robot_localization ROS node with only one input and the following odom0_config:
odom0_config: [ false, false, false,   # x y z
                false, false, false,   # roll pitch yaw
                true, true, false,     # vx vy vz
                false, false, true,    # vroll vpitch vyaw
                false, false, false ]  # ax ay az

And now it works.

@BrooklynBoy21

Hi @mbed92, I want to fuse wheel odometry with IMU data from the D435i. I want to try your solution, but what is the wheel_odometry topic? There is no wheel_odometry param in the realsense2_camera node.

@mbed92

mbed92 commented Jul 21, 2021

Hi @mbed92, I want to fuse wheel odometry with IMU data from the D435i. I want to try your solution, but what is the wheel_odometry topic? There is no wheel_odometry param in the realsense2_camera node.

Hey,

By wheel_odometry I meant a topic with nav_msgs/Odometry messages, where you publish the robot's pose and twist calculated from e.g. wheel encoders. I passed that topic to the T265 camera node via the topic_odom_in param, and the node then fuses both odometries (wheel and visual).

I guess that in your case you should use a fusion algorithm, e.g. the EKF from the robot_localization package, where you pass in all the sources (wheel odometry, IMU, etc.). You might look into the example configurations there.
