diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/generic-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/generic-calibration.md
index 31c6d1a6c37..06f22daa8ee 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/generic-calibration.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/generic-calibration.md
@@ -25,9 +25,9 @@ So our tutorial_vehicle's recorded topics should like this:
End: Sep 6 2023 13:46:43.914 (1693997203.914)
Messages: 8504
Topic information: Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1691 | Serialization Format: cdr
- Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1691 | Serialization Format: cdr
- Topic: /sensing/camera/camera0/image_rect | Type: sensor_msgs/msg/Image | Count: 2561 | Serialization Format: cdr
- Topic: /sensing/camera/camera0/camera_info | Type: sensor_msgs/msg/CameraInfo | Count: 2561 | Serialization Format: cdr
+ Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1691 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/image_rect | Type: sensor_msgs/msg/Image | Count: 2561 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/camera_info | Type: sensor_msgs/msg/CameraInfo | Count: 2561 | Serialization Format: cdr
```
## Extrinsic Manual Based Calibration
@@ -59,7 +59,7 @@ Let's start with adding vehicle_id and sensor model names. (Optionally, values a
+
```
-??? note "i.e. tutorial_vehicle"
+??? note "i.e. vehicle_id and sensor_model definition on tutorial_vehicle (manual.launch.xml)"
```xml
@@ -131,7 +131,7 @@ Optionally, you can modify sensor_model and vehicle_id over this xml snippet as
+
```
-??? note "i.e. vehicle_id and sensor_model definition on tutorial_vehicle"
+??? note "i.e. vehicle_id and sensor_model definition on tutorial_vehicle (manual_sensor_kit.launch.xml)"
```xml
+
@@ -303,13 +303,13 @@ colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --package
So, we are ready to launch and use the manual calibrator.
```bash
-ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:= vehicle_model:= vehicle_id:=
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:= vehicle_model:= vehicle_id:=
```
For tutorial vehicle:
```bash
-ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
```
Then play ROS 2 bag file:
@@ -320,7 +320,7 @@ ros2 bag play --clock -l -r 0.2 \
```
You will be shown the manual calibrator's rqt_reconfigure window;
-we will update calibrations by hand according to the rviz2 results.
+we will update the calibrations by hand according to the rviz2 results of the sensors.
- Press the `Refresh` button, then press the `Expand All` button. The frames on tutorial_vehicle should look like this:
@@ -332,5 +332,5 @@ we will update calibrations by hand according to the rviz2 results.
and camera-lidar calibration. At this point, it is hard to calibrate two sensors to exactly the same frame, so you should find
approximate (it does not have to be perfect) calibration pairs between the sensors.
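When tuning by hand, you adjust x, y, z, roll, pitch and yaw for each frame; TF publishes the rotation part as a quaternion. As a rough sanity-check aid, here is a minimal sketch (plain Python, hypothetical angles, not part of the calibration tools) of the fixed-axis RPY-to-quaternion conversion that tf2 applies:

```python
import math

def rpy_to_quaternion(roll, pitch, yaw):
    """Convert fixed-axis roll/pitch/yaw (radians) to a quaternion (x, y, z, w),
    matching the ZYX fixed-axis convention used by tf2's setRPY."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )

# Hypothetical value you might dial in for a sensor yawed 90 degrees:
print(rpy_to_quaternion(0.0, 0.0, math.pi / 2))  # roughly (0.0, 0.0, 0.707, 0.707)
```

Comparing the quaternion published on `/tf_static` against this conversion of your rqt_reconfigure values is a quick way to confirm the parameters were picked up.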
-Here is the video for demonstrating a manual calibration process on tutorial vehicle:
+Here is a video demonstrating the manual calibration process on tutorial_vehicle:
![type:video](https://youtube.com/embed/axHILP0PiaQ)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration-result.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration-result.png
new file mode 100644
index 00000000000..cebb427d98e
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration-result.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration.png
new file mode 100644
index 00000000000..97b0c8eba5a
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/images/mapping-based-calibration.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
index 2568da212d8..531ee468d1b 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-description/calibrating-sensors/lidar-lidar-calibration.md
@@ -18,6 +18,21 @@ If you want more details about these methods such as usage, troubleshooting etc.
Please obtain the initial calibration results from the [Generic Calibration](./generic-calibration.md) section; they are important for getting accurate results from this tool.
We will use the initial calibration parameters that we calculated in the previous step of this tutorial.
+??? note "ROS 2 Bag example of our calibration process for tutorial_vehicle"
+
+ ```sh
+
+ Files: rosbag2_2023_09_05-11_23_50_0.db3
+ Bag size: 3.8 GiB
+ Storage id: sqlite3
+ Duration: 112.702s
+ Start: Sep 5 2023 11:23:51.105 (1693902231.105)
+ End: Sep 5 2023 11:25:43.808 (1693902343.808)
+ Messages: 2256
+ Topic information: Topic: /sensing/lidar/right/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ ```
+
## Mapping-based lidar-lidar
We start with creating a launch file for our vehicle, as in the "Extrinsic Manual Calibration" section.
@@ -39,11 +54,21 @@ Then we will continue with adding vehicle_id and sensor model names to the `mapp
```diff
+
+
-+
++
+
-+
++
```
+??? note "i.e. vehicle_id and sensor_model definition on tutorial_vehicle (mapping_based.launch.xml)"
+
+ ```xml
+ +
+ +
+ +
+ +
+ +
+ ```
+
After that, we will launch our sensor_kit for mapping-based lidar-lidar calibration,
so we must add these lines to mapping_based.launch.xml:
@@ -60,7 +85,7 @@ so we must add these lines on manual.launch.xml:
The final version of the file (mapping_based.launch.xml) for tutorial_vehicle should be like this:
-??? note "Sample manual.launch.xml file for tutorial vehicle"
+??? note "Sample mapping_based.launch.xml file for tutorial_vehicle"
```xml
@@ -91,8 +116,8 @@ which included at the end of the page)
```diff
+
+
-+
-+
++
++
+
+
+
@@ -119,10 +144,14 @@ We will add sensor kit frames for each lidar (except mapping lidar),
we have one lidar to pair with the main lidar sensor on tutorial_vehicle, so it should look like this:
```diff
-+
+ +
```
-??? note "i.e., If you have three lidars (one main for mapping, two others)"
+??? note "i.e., If you have one main lidar for mapping and three lidars for calibration"
```xml
+
```
+??? note "i.e., For tutorial_vehicle (one main lidar for mapping, one lidar for calibration)"
+
+ ```xml
+ +
+ ```
+
We will add `lidar_calibration_service_names`,
`calibration_lidar_base_frames` and `calibration_lidar_frames` for the calibrator.
-At the tutorial_vehicle it should be like this snippet:
```diff
+
-
-+
-+
++
++
++
```
-??? note "i.e., If you have three lidars (one main for mapping, two others)"
+??? note "i.e., If you have three lidars (one main for mapping, two others) for [aip_x1](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_x1/mapping_based_sensor_kit.launch.xml)"
```xml
```
+
+??? note "i.e., At tutorial_vehicle it should look like this snippet"
+
+ ```xml
+ +
+
+ +
+ +
+ ```
+
+After that, we will add the sensor topics and sensor frames. In order to do that,
+we will continue filling `mapping_based_sensor_kit.launch.xml` with:
+
+```diff
++
++
++
++
++
++
++
++
+```
+
+??? note "i.e., If you have three lidars (one main for mapping, two others) for [aip_x1](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_x1/mapping_based_sensor_kit.launch.xml)"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+ ```
+
+??? note "i.e., At tutorial_vehicle it should look like this snippet"
+
+ ```xml
+
+
+
+
+
+
+
+
+ ```
+
+Then we will include the extrinsic_mapping_based_calibrator launch file with these arguments in mapping_based_sensor_kit.launch.xml:
+
+??? note "extrinsic_mapping_based_calibrator launch example"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+The final mapping_based_sensor_kit.launch.xml launch file for tutorial_vehicle should look like this:
+
+??? note "i.e. mapping_based_sensor_kit.launch.xml for tutorial_vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+After completing the mapping_based.launch.xml and mapping_based_sensor_kit.launch.xml launch files for your own sensor kit,
+we are ready to calibrate our lidars.
+First of all, we need to build the extrinsic_calibration_manager package:
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --packages-select extrinsic_calibration_manager
+```
+
+So, we are ready to launch and use the mapping-based calibrator.
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=mapping_based sensor_model:= vehicle_model:= vehicle_id:=
+```
+
+For tutorial_vehicle:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=mapping_based sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+```
+
+You will see the rviz2 screen with several configurations;
+you need to update it with your sensor topics, as in the video
+which is included at the end of the document.
+Also, you can save the rviz2 config in the rviz directory,
+so you can use it later by modifying `mapping_based.launch.xml`.
+
+```diff
+extrinsic_mapping_based_calibrator/
+ └─ rviz/
++ └─ tutorial_vehicle_sensor_kit.rviz
+```
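If you save a config like `tutorial_vehicle_sensor_kit.rviz`, one common way to load it from a ROS 2 XML launch file is to pass it to rviz2 with its `-d` flag. This is only an illustrative sketch using standard launch substitutions; the actual argument wiring inside `mapping_based.launch.xml` may differ:

```xml
<launch>
  <!-- Illustrative sketch: load the saved rviz config with rviz2's -d flag -->
  <let name="rviz_config"
       value="$(find-pkg-share extrinsic_mapping_based_calibrator)/rviz/tutorial_vehicle_sensor_kit.rviz"/>
  <node pkg="rviz2" exec="rviz2" args="-d $(var rviz_config)"/>
</launch>
```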
+
+Then play ROS 2 bag file:
+
+```bash
+ros2 bag play --clock -l -r 0.2 \
+--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
+```
+
+The calibration step consists of two phases: mapping and calibration.
+When the bag starts playing, mapping starts as well, as in the rviz2 screenshot below.
+
+![mapping-based-calibration](images/mapping-based-calibration.png)
+
+The red arrow markers indicate poses during mapping,
+the green arrow markers are special poses sampled uniformly,
+and the white points indicate the constructed map.
+
+Mapping halts upon reaching a predefined data threshold,
+or it can be concluded prematurely by invoking this service:
+
+```bash
+ros2 service call /NAMESPACE/stop_mapping std_srvs/srv/Empty {}
+```
+
+After the mapping phase is completed, the calibration process will start.
+After the calibration is completed, you should see an rviz2 screen like the image below:
+
+![mapping-based-calibration-result](images/mapping-based-calibration-result.png)
+
+The red points indicate the point cloud with the initial calibration results from the [previous section](./generic-calibration.md).
+The green points indicate the aligned point cloud (the calibration result).
+The calibration results will be saved automatically to your
+`dst_yaml` (`$HOME/sensor_kit_calibration.yaml`) in this tutorial.
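Each entry of the saved calibration YAML is an x, y, z, roll, pitch, yaw transform from a parent frame to a sensor frame. The sketch below (plain Python, hypothetical values, not part of the calibration tools) shows how such an entry maps a point from the sensor frame into the parent frame:

```python
import math

def transform_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a translation plus fixed-axis RPY."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll), stacked with the translation
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply(t, p):
    """Apply a 4x4 transform to a 3D point, returning the transformed point."""
    ph = p + (1.0,)
    return tuple(sum(t[i][j] * ph[j] for j in range(4)) for i in range(3))

# Hypothetical calibration entry: a lidar mounted 1.0 m forward, yawed 90 degrees.
T = transform_matrix(1.0, 0.0, 0.0, 0.0, 0.0, math.pi / 2)
print(apply(T, (1.0, 0.0, 0.0)))  # approximately (1.0, 1.0, 0.0)
```

A point 1 m ahead of the yawed lidar lands 1 m forward and 1 m to the left in the parent frame, which is a quick way to check that a saved entry is plausible.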
+
+Here is a video demonstrating the mapping-based calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/--WBNP76GoE)