Issue Description
I'm trying to use the D435i to build a VIO pipeline with the robot_localization, rtabmap_odom, and imu_filter_madgwick packages. My current pipeline is:
an IMU filter node that produces processed IMU data with orientation
a visual odometry node that uses the stereo images to compute odometry
a UKF node that fuses the IMU and visual odometry into the final VIO odometry
If I only launch the realsense ros2 wrapper, the tf tree is:
This is my current launch file:
#!/usr/bin/env python3
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node
from launch_ros.substitutions import FindPackageShare
from launch.substitutions import PathJoinSubstitution
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    ld = LaunchDescription()

    # realsense launch
    rs_camera_node = Node(
        package="realsense2_camera",
        executable="realsense2_camera_node",
        name="camera",
        namespace="camera",
        parameters=[{
            "align_depth.enable": True,
            "linear_accel_cov": 1.0,
            "unite_imu_method": 2,  # linear_interpolation
            "enable_rgbd": True,
            "enable_sync": True,
            "enable_color": True,
            "enable_depth": True,
            "enable_accel": True,
            "enable_gyro": True,
            "initial_reset": False,
            "publish_tf": True,
            "rgb_camera.color_profile": "640x480x30",
            "depth_module.infra_profile": "640x480x30",
        }]
    )

    # IMU filter
    imu_node = Node(
        package='imu_filter_madgwick',
        executable='imu_filter_madgwick_node',
        name='imu_filter',
        parameters=[{
            "use_mag": False,
            "publish_tf": True,
            "world_frame": "enu",
            "fixed_frame": "camera_link",  # The parent frame to be used in publish_tf
            # "fixed_frame": "camera_imu_frame",  # The parent frame to be used in publish_tf
        }],
        remappings=[
            ('/imu/data_raw', '/camera/camera/imu'),
            ('/imu/data', '/rtabmap/imu'),
        ]
    )

    # camera base to camera_link (fr1_optical_frame)
    # TODO: transformation from base_camera to camera_link
    base_camera_tf_camera_link = LaunchDescription([
        Node(
            package="tf2_ros",
            executable="static_transform_publisher",
            arguments=["0", "0", "1", "0", "0", "0", "base_camera", "camera_link"]
        )
    ])

    # TODO: measure the robot base_link to base_camera
    robot_base_tf_base_camera = LaunchDescription([
        Node(
            package="tf2_ros",
            executable="static_transform_publisher",
            arguments=["0.0", "0.0", "0.0", "0.0", "0.0", "0.0", "base_link", "base_camera"]
        )
    ])

    # rtabmap_odom node
    # http://wiki.ros.org/rtabmap_odom#stereo_odometry
    rtabmap_odom_node = Node(
        package='rtabmap_odom',
        executable='stereo_odometry',
        name="rtab_stereo_odometry",
        output='screen',
        parameters=[{
            # 'frame_id': 'base_link',
            'frame_id': 'camera_base',
            'visual_odometry': True,
            'icp_odometry': False,
            'stereo': True,
            'publish_tf': False,  # let the ekf publish the odom frame, hence set to false for this node
            'subscribe_odom_info': True,
        }],
        remappings=[
            ('odom', '/stereo_vo'),  # resulting odom published to /stereo_vo
            ('left/image_rect', '/camera/camera/infra1/image_rect_raw'),
            ('left/camera_info', '/camera/camera/infra1/camera_info'),
            ('right/image_rect', '/camera/camera/infra2/image_rect_raw'),
            ('right/camera_info', '/camera/camera/infra2/camera_info')
        ]
    )

    # ukf node to get VIO
    # https://docs.ros.org/en/melodic/api/robot_localization/html/state_estimation_nodes.html
    filter_odom_node = Node(
        package="robot_localization",
        executable="ukf_node",
        name="ukf_filter_node",
        parameters=[{
            "frequency": 30.0,
            "odom0": "/stereo_vo",
            "odom0_config": [True, True, True,
                             True, True, True,
                             True, True, True,
                             True, True, True,
                             True, True, True],
            "odom0_relative": True,
            "imu0": "/rtabmap/imu",
            "imu0_config": [False, False, False,
                            True, True, True,
                            True, True, True,
                            True, True, True,
                            True, True, True],
            "imu0_differential": True,
            "imu0_relative": False,
            "use_control": False
        }],
        remappings=[
            ('/odometry/filtered', '/odom')
        ]
    )

    ld.add_action(rs_camera_node)
    ld.add_action(imu_node)
    ld.add_action(base_camera_tf_camera_link)
    ld.add_action(robot_base_tf_base_camera)
    ld.add_action(rtabmap_odom_node)
    ld.add_action(filter_odom_node)
    return ld
Questions
In the IMU filter node, should the parameter fixed_frame be camera_link or camera_imu_frame? The documentation says it should be the parent frame used for publish_tf, so my first guess was camera_imu_frame, since we filter the raw IMU data from camera_imu_optical_frame, whose parent frame is camera_imu_frame. But then I realized that the transform between camera_imu_optical_frame and camera_link should always be static, so I set fixed_frame to camera_link. The resulting tf tree is shown below. That raises a new question: the tf between camera_link and camera_imu_optical_frame is now published by the imu_filter_madgwick node at roughly 200 Hz on average. Does this TF take care of the transformations along the original chain, which was camera_link --> camera_gyro_frame --> camera_imu_frame --> camera_imu_optical_frame?
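To see what that chain actually resolves to at runtime, I put together the rough check below (just a sketch, assuming the standard tf2_ros Python listener API; the frame names are the ones from my tf tree). My thinking is that if the lookup keeps succeeding while only the filter is publishing the dynamic edge, the full chain is still covered:

# tf_chain_check.py - rough sketch to confirm camera_link -> camera_imu_optical_frame
# still resolves once imu_filter_madgwick starts publishing its dynamic transform.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros import Buffer, TransformListener, TransformException


class TfChainCheck(Node):
    def __init__(self):
        super().__init__('tf_chain_check')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.check)

    def check(self):
        try:
            # lookup_transform walks the whole chain, mixing static and dynamic edges
            t = self.buffer.lookup_transform(
                'camera_link', 'camera_imu_optical_frame', Time())
            self.get_logger().info(f'translation: {t.transform.translation}')
        except TransformException as e:
            self.get_logger().warn(str(e))


def main():
    rclpy.init()
    rclpy.spin(TfChainCheck())


if __name__ == '__main__':
    main()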
It seems like the VIO results drift a lot. Currently, I check the results in RViz2, where I set the Fixed Frame to odom and use rviz_imu_plugin to visualize the filtered IMU data. Is this a proper way to check odometry performance, or is there a better way to do it?
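Beyond eyeballing it in RViz2, the only other quick check I have is a small logger that records how far the fused /odom pose wanders while the camera sits still on the desk (just a rough sketch of my own, not a proper benchmark):

# drift_logger.py - rough sketch: log the drift of the fused /odom pose while
# the camera is held static, so the reported distance should stay near zero.
import math

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry


class DriftLogger(Node):
    def __init__(self):
        super().__init__('drift_logger')
        self.start = None
        self.create_subscription(Odometry, '/odom', self.callback, 10)

    def callback(self, msg):
        p = msg.pose.pose.position
        if self.start is None:
            self.start = (p.x, p.y, p.z)
            return
        dx = p.x - self.start[0]
        dy = p.y - self.start[1]
        dz = p.z - self.start[2]
        self.get_logger().info(f'drift: {math.sqrt(dx * dx + dy * dy + dz * dz):.3f} m')


def main():
    rclpy.init()
    rclpy.spin(DriftLogger())


if __name__ == '__main__':
    main()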
Thanks!
Based on your script setup, it should be camera_imu_frame. The coordinate conventions are described in the "ROS2(Robot) vs Optical(Camera) Coordination Systems" section at https://github.com/IntelRealSense/realsense-ros/tree/ros2-development, and the transformations can be checked with TF in rviz2 as well as on the /tf_static topic.
What kind of drift are you seeing? To isolate the issue, also check the raw accel and gyro data that comes out of the RealSense SDK. Tools like the realsense-viewer and the motion example at https://github.com/IntelRealSense/librealsense/tree/development/examples/motion will be helpful, as will the raw data coming out of the accel and gyro ROS topics.
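For example, a rough subscriber like the one below (the topic name is taken from your launch file above, so adjust it if your namespace or topics differ) can show whether the gyro mean stays near zero and the accel vector stays near 9.8 m/s^2 while the camera is at rest; a large constant offset there points at the raw data rather than the fusion:

# imu_stats.py - rough sketch: print running means of the raw accel/gyro samples
# coming out of the realsense wrapper to spot obvious bias or dropouts.
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import Imu


class ImuStats(Node):
    def __init__(self):
        super().__init__('imu_stats')
        self.count = 0
        self.sums = [0.0] * 6
        # united IMU topic from the launch file above; adjust to your setup
        self.create_subscription(
            Imu, '/camera/camera/imu', self.callback, qos_profile_sensor_data)

    def callback(self, msg):
        a, g = msg.linear_acceleration, msg.angular_velocity
        samples = [a.x, a.y, a.z, g.x, g.y, g.z]
        self.count += 1
        self.sums = [s + v for s, v in zip(self.sums, samples)]
        if self.count % 200 == 0:
            mean = [s / self.count for s in self.sums]
            self.get_logger().info(f'mean accel: {mean[:3]}  mean gyro: {mean[3:]}')


def main():
    rclpy.init()
    rclpy.spin(ImuStats())


if __name__ == '__main__':
    main()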