diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/calibrating-sensors.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/calibrating-sensors.md
index ec3c1bd2b72..55224194682 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/calibrating-sensors.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/calibrating-sensors.md
@@ -2,9 +2,46 @@

 ## Overview

-Autoware expects to have multiple sensors attached to the vehicle as input to perception, localization, and planning stack. These sensors must be calibrated correctly, and their positions must be defined using either urdf files (as in [sample_sensor_kit](https://github.com/autowarefoundation/sample_sensor_kit_launch/tree/main/sample_sensor_kit_description)) or as tf launch files.
+Autoware expects to have multiple sensors attached to the vehicle as input to the perception,
+localization,
+and planning stacks.
+These sensors must be calibrated correctly,
+and their positions must be defined in the `sensor_kit_description` and `individual_params` packages.
 In order to do that, we will use TIER IV's [CalibrationTools](https://github.com/tier4/CalibrationTools) repository.

+## Setting the sensor_kit_base_link position relative to the base_link
+
+In the previous section (creating the vehicle and sensor model),
+we mentioned `sensors_calibration.yaml`.
+This file stores the position and orientation of `sensor_kit_base_link` (child frame) relative to
+`base_link` (parent frame).
+We need to update this relative position
+(all values are initially set to zero when the file is created)
+using the CAD data of our vehicle.
+
+<figure markdown>
+ ![sensor_kit_base_link](images/tutorial_vehicle_sensor_kit_base_link.png){ align=center }
+ <figcaption>
+ The base_link to sensor_kit_base_link transformation of our tutorial_vehicle.
+ </figcaption>
+
+
+So, our `sensors_calibration.yaml` file for our tutorial_vehicle should be like this:
+
+```yaml
+base_link:
+  sensor_kit_base_link:
+    x: 1.600000 # meter
+    y: 0.0
+    z: 1.421595 # 1.151595m + 0.270m
+    roll: 0.0
+    pitch: 0.0
+    yaw: 0.0
+```
+
+You need to update these transformation values according to the actual position of the `sensor_kit_base_link` frame on your vehicle.
+You can also use the CAD values for the GNSS/INS and IMU positions in the `sensor_kit_calibration.yaml` file.
+
 ## Installing TIER IV's CalibrationTools repositories on autoware

 After completing previous steps (creating your own autoware,
@@ -34,8 +71,48 @@ for calibrating different sensor pairs such as lidar-lidar,
 camera-lidar, ground-lidar etc.
 In order to calibrate our sensors,
 we will modify `extrinsic_calibration_package` for our sensor kit.
+For tutorial_vehicle,
+the launch files completed by following the tutorial sections below can be found [here](https://github.com/leo-drive/calibration_tools_tutorial_vehicle/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit).
+
 - [Manual Calibration](./manual-calibration.md)
 - [Lidar-Lidar Calibration](./lidar-lidar-calibration.md)
 - [Ground Plane-Lidar Calibration](./ground-lidar-calibration.md)
 - [Intrinsic Camera Calibration](./intrinsic-camera-calibration.md)
 - [Lidar-Camera Calibration](./lidar-camera-calibration.md)
+
+## Other packages you can check out
+
+### Camera calibration
+
+#### Intrinsic Calibration
+
+- Navigation2 provides a [good tutorial for camera internal calibration](https://navigation.ros.org/tutorials/docs/camera_calibration.html).
+- [AutoCore](https://autocore.ai/) provides a [light-weight tool](https://github.com/autocore-ai/calibration_tools/tree/main/camera_intrinsic_calib).
+
+### Lidar-lidar calibration
+
+#### Lidar-Lidar Calibration tool from AutoCore
+
+[LL-Calib on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/lidar-lidar-calib), provided by [AutoCore](https://autocore.ai/), is a lightweight toolkit for online/offline 3D LiDAR to LiDAR calibration. It is based on local mapping and the GICP method to derive the relation between the main and sub lidars. Information on how to use the tool, troubleshooting tips and example rosbags can be found at the above link.
+
+### Lidar-camera calibration
+
+Developed by MathWorks, the Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.
+
+
+SensorsCalibration toolbox v0.1: one more open-source method for lidar-camera calibration.
+This is a project for LiDAR-to-camera calibration, including automatic calibration and manual calibration.
+
+
+
+[AutoCore](https://autocore.ai/) also provides an easy-to-use, lightweight toolkit for lidar-camera calibration: a fully automatic calibration is done in only three steps.
+
+
+
+### Lidar-IMU calibration
+
+Developed by [APRIL Lab](https://github.com/APRIL-ZJU) at Zhejiang University in China, LI-Calib is a toolkit for calibrating the 6DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization.
+The IMU-based cost and the LiDAR point-to-surfel (surfel = surface element) distance are minimized jointly, which renders the calibration problem well-constrained in general scenarios.
+
+[AutoCore](https://autocore.ai/) has forked the original LI-Calib tool and reworked the lidar input for more general usage.
Information on how to use the tool, troubleshooting tips and example rosbags can be found at the [LI-Calib fork on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/li_calib).
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/ground-lidar-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/ground-lidar-calibration.md
index 42b8340c039..15030e36579 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/ground-lidar-calibration.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/ground-lidar-calibration.md
@@ -46,7 +46,7 @@ The created `ground_plane.launch.xml` and `ground_plane_sensor_kit.launch.xml` a
 provided from TIER IV.
 Then we will continue with adding vehicle_id and sensor model names to the `ground_plane.launch.xml`.
-(Optionally, values are not important. These parameters Overrode from launch argument)
+(Optional: the values themselves are not important, since these parameters will be overridden by launch arguments.)

 ```diff
 +
@@ -101,7 +101,7 @@ The final version of the file (ground_plane.launch.xml) for tutorial_vehicle sho
 After the completing of ground_plane.launch.xml file,
 we will be ready to implement ground_plane_sensor_kit.launch.xml for the own sensor model.
-Optionally, (don't forget, these parameters overrode by ROS 2 launch arguments.)
+Optionally (don't forget, these parameters will be overridden by launch arguments),
 you can modify sensor_kit and vehicle_id as `ground_plane.launch.xml`over this xml snippet:
 (You can change rviz_profile path after the saving rviz config as video which included at the end of the page)
@@ -316,5 +316,5 @@ you can see the sensor_kit_calibration.yaml in your $HOME directory after the ca
 | :--------------------------------------------------------: | :-------------------------------------------------------------: |
 | ![before-ground-plane.png](images/before-ground-plane.png) | ![images/after-ground-plane.png](images/after-ground-plane.png) |

-Here is the video for demonstrating a ground plane - lidar calibration process on tutorial_vehicle:
+Here is the video demonstrating the ground plane - lidar calibration process on tutorial_vehicle:
 ![type:video](https://youtube.com/embed/EqaF1fufjUc)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/chess-board-7x7.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/chess-board-7x7.png
new file mode 100644
index 00000000000..fd28cf70edb
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/chess-board-7x7.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-data-collecting.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-data-collecting.png
new file mode 100644
index 00000000000..e4772a312ce
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-data-collecting.png differ
diff --git
a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-initial-configuration.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-initial-configuration.png new file mode 100644 index 00000000000..0ef3507f98c Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-initial-configuration.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-topic-information.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-topic-information.png new file mode 100644 index 00000000000..ead4182953a Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/intrinsic-topic-information.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/tutorial_vehicle_sensor_kit_base_link.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/tutorial_vehicle_sensor_kit_base_link.png new file mode 100644 index 00000000000..943851c694c Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/tutorial_vehicle_sensor_kit_base_link.png differ diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/intrinsic-camera-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/intrinsic-camera-calibration.md index ec7184ac317..a65f7b448a9 100644 --- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/intrinsic-camera-calibration.md +++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/intrinsic-camera-calibration.md @@ -8,10 +8,111 @@ These parameters include focal length, optical center, and lens distortion coeff In order to perform camera Intrinsic calibration, we will use TIER IV's [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) tool. First of all, we need a calibration board which can be dot, chess or apriltag grid board. -In this tutorial, we will use a 7x7 chess board consisting of 7 cm squares. +In this tutorial, we will use this 7x7 chess board consisting of 7 cm squares: + +
+ ![chess_board](images/chess-board-7x7.png){ width="500"} +
+ Our 7x7 calibration chess board for this tutorial section. +
+
Here are some calibration board samples from the [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page:

 - Chess boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/checkerboard_8x6.pdf))
 - Circle dot boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/circle_8x6.pdf))
 - Apriltag grid board ([3x4 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/apriltag_grid_3x4.pdf))
+
+If you want to use a bag file for the calibration process,
+it must include the `image_raw` topic of your camera sensor.
+However, you can also perform the calibration in real time
+(recommended).
+
+??? note "ROS 2 Bag example for intrinsic camera calibration process"
+
+    ```sh
+
+    Files:             rosbag2_2023_09_18-16_19_08_0.db3
+    Bag size:          12.0 GiB
+    Storage id:        sqlite3
+    Duration:          135.968s
+    Start:             Sep 18 2023 16:19:08.966 (1695043148.966)
+    End:               Sep 18 2023 16:21:24.934 (1695043284.934)
+    Messages:          4122
+    Topic information: Topic: /sensing/camera/camera0/image_raw | Type: sensor_msgs/msg/Image | Count: 2061 | Serialization Format: cdr
+
+    ```
+
+## Intrinsic camera calibration
+
+Unlike the other calibration packages in our tutorials,
+this package does not need an initialization file,
+so we can start by launching the intrinsic calibrator package.
+
+```bash
+cd <your-autoware-workspace>
+source install/setup.bash
+```
+
+After that, we will launch the intrinsic calibrator:
+
+```bash
+ros2 launch intrinsic_camera_calibrator calibrator.launch.xml
+```
+
+Then, the initial configuration and camera intrinsic calibration panels will show up.
+We set the initial configuration for our calibration.
+
+<figure markdown>
+ ![initial_configuration](images/intrinsic-initial-configuration.png){ width="200"}
+ <figcaption>
+ Initial configuration panel +
+
+
+We set our image source (it can be "ROS topic", "ROS bag" or "Image files")
+from the source options section.
+We will calibrate our camera with the "ROS topic" source.
+After selecting an image source from this panel, we need to configure "Board options" as well.
+The calibration board can be "Chess board", "Dot board" or "Apriltag".
+We also need to set the board parameters;
+to do that, click the "Board parameters" button and set the row count, column count, and cell size.
+(Note that calibration tools usually count the inner corners of a chess board rather than its squares;
+a board with 7x7 squares has 6x6 inner corners.)
+
+After setting the image source and board parameters, we are ready for the calibration process.
+Please click the start button; the "Topic configuration" panel will show up.
+Please select the appropriate camera raw topic for the calibration process.
+
+<figure markdown>
+ ![topic_configuration](images/intrinsic-topic-information.png){ width="200"}
+ <figcaption>
+ Topic configuration panel +
+
+
+Then you are ready to calibrate.
+Please collect data at different X-Y positions, with different sizes and skews.
+You can see the statistics of the collected data by clicking "View data collection statistics".
+For more information,
+please refer to the [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page.
+
+<figure markdown>
+ ![data_collection](images/intrinsic-data-collecting.png){ align=center}
+ <figcaption>
+ Intrinsic calibrator interface +
+
+
+After the data collection is completed,
+you can click the "Calibrate" button to perform the calibration process.
+After the calibration is completed,
+you will see a data visualization of the calibration result statistics.
+You can inspect your calibration results by changing the "Image view type" from "Source unrectified"
+to "Source rectified".
+If your calibration is successful (there should be no distortion in the rectified image),
+you can save your calibration results with the "Save" button.
+The output will be named as `<camera_name>_info.yaml`,
+so you can use this file directly with your camera driver.
+
+Here is the video
+demonstrating the intrinsic camera calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/jN77AdGFrGU)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-camera-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-camera-calibration.md
index e506d995c13..32b287b6cbd 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-camera-calibration.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-camera-calibration.md
@@ -56,7 +56,7 @@ The created `interactive.launch.xml` and `interactive_sensor_kit.launch.xml` are
 [aip_xx1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_xx1) provided from TIER IV.
 Then we will continue with adding vehicle_id and sensor model names to the `mapping_based.launch.xml`.
-(Optionally, values are not important. These parameters Overrode from launch argument) Then,
+(Optional: the values themselves are not important, since these parameters will be overridden by launch arguments.) Then,
 we will add camera_name for calibrating camera (can be one of the camera0, camera1, camera_front etc. as launch argument) and `use_concatenated_pointcloud` argument.
@@ -123,7 +123,7 @@ The final version of the file (interactive.launch.xml) for tutorial_vehicle shou
 After the completing of interactive.launch.xml file,
 we will be ready to implement interactive_sensor_kit.launch.xml for the own sensor model.
-Optionally, (don't forget, these parameters overrode by ROS 2 launch arguments.)
+Optionally (don't forget, these parameters will be overridden by launch arguments),
 you can modify sensor_kit and vehicle_id as `interactive.launch.xml`over this xml snippet. We will set parent_frame for calibration as `sensor_kit_base_link`:
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
index 1826e9fde65..1e7aacc843f 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
@@ -7,11 +7,6 @@ we will explain lidar-lidar calibration over [mapping-based lidar-lidar calibrat
 CalibrationTools.
 Also, a [map-based lidar-lidar calibration method](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_map_based.md) is included in TIER IV's tools.
-[LL-Calib](https://github.com/autocore-ai/calibration_tools/tree/main/lidar-lidar-calib) on GitHub,
-provided by [AutoCore](https://autocore.ai/),
-is a lightweight toolkit for online/offline 3D LiDAR to LiDAR calibration.
-It's based on local mapping and "GICP" method to derive the relation between main and sub lidar.
-If you want more details about these methods such as usage, troubleshooting etc. please check the links above.

 !!! warning

@@ -50,7 +45,7 @@ The created `mapping_based.launch.xml` and `mapping_based_sensor_kit.launch.xml`
 provided from TIER IV.
 Then we will continue with adding vehicle_id and sensor model names to the `mapping_based.launch.xml`.
-(Optionally, values are not important. These parameters Overrode from launch argument)
+(Optional: the values themselves are not important, since these parameters will be overridden by launch arguments.)

 ```diff
 +
@@ -204,7 +199,7 @@ calibration_lidar_base_frames and calibration_lidar_frames for calibrator.
 livox_front_right]"/>
 ```

-??? note "i.e., At the tutorial_vehicle it should be like this snippet."
+??? note "e.g., for the tutorial_vehicle it should be like this snippet"

     ```xml
    +<