diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/.pages b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/.pages
index 35fd5a113be..b626b1270c9 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/.pages
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/.pages
@@ -1,2 +1,8 @@
nav:
- index.md
+ - Starting with TIER IV's CalibrationTools: calibration-tools
+ - Extrinsic manual calibration: extrinsic-manual-calibration
+ - Lidar-lidar calibration: lidar-lidar-calibration
+ - Ground plane-lidar calibration: ground-lidar-calibration
+ - Intrinsic camera calibration: intrinsic-camera-calibration
+ - Lidar-camera calibration: lidar-camera-calibration
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/images/tutorial_vehicle_sensor_kit_base_link.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/images/tutorial_vehicle_sensor_kit_base_link.png
new file mode 100644
index 00000000000..943851c694c
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/images/tutorial_vehicle_sensor_kit_base_link.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/index.md
new file mode 100644
index 00000000000..cc54d7dc288
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/calibration-tools/index.md
@@ -0,0 +1,84 @@
+# Starting with TIER IV's CalibrationTools
+
+## Overview
+
+Autoware expects multiple sensors to be attached to the vehicle as input to the perception,
+localization,
+and planning stacks.
+These sensors must be calibrated correctly,
+and their positions must be defined in the `sensor_kit_description` and `individual_params` packages.
+In this tutorial,
+we will use TIER IV's [CalibrationTools](https://github.com/tier4/CalibrationTools) repository for the calibration.
+
+## Setting the sensor_kit_base_link position with respect to the base_link
+
+In the previous section (creating the vehicle and sensor model),
+we mentioned the `sensors_calibration.yaml` file.
+This file stores the `sensor_kit_base_link` (child frame) position and orientation with respect to the
+`base_link` (parent frame).
+We need to update this relative position
+(all values were initially set to zero when the file was created)
+using the CAD data of our vehicle.
+
+![tutorial_vehicle_sensor_kit_base_link](images/tutorial_vehicle_sensor_kit_base_link.png)
+
+So, our `sensors_calibration.yaml` file for tutorial_vehicle should look like this:
+
+```yaml
+base_link:
+ sensor_kit_base_link:
+ x: 1.600000 # meter
+ y: 0.0
+ z: 1.421595 # 1.151595m + 0.270m
+ roll: 0.0
+ pitch: 0.0
+ yaw: 0.0
+```
+
+You need to update this transformation value, which gives the pose of the `sensor_kit_base_link` frame.
+You can also use CAD values for the GNSS/INS and IMU positions in the `sensor_kit_calibration.yaml` file.
+(Please don't forget
+to update the `sensor_kit_calibration.yaml` file in both the sensor_kit_launch and individual_params packages.)
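+
+As an illustration, a GNSS/INS entry in the `sensor_kit_calibration.yaml` file taken from CAD data might look like the following sketch (the `gnss_link` frame name and all values here are hypothetical, not actual tutorial_vehicle measurements):
+
+```yaml
+sensor_kit_base_link:
+  gnss_link: # hypothetical GNSS/INS frame name
+    x: -0.1 # meter, measured from CAD data
+    y: 0.0
+    z: 0.2
+    roll: 0.0
+    pitch: 0.0
+    yaw: 0.0
+```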
+
+## Installing TIER IV's CalibrationTools repositories on autoware
+
+After completing the previous steps (creating your own Autoware,
+creating a vehicle and sensor model, etc.),
+we are ready to calibrate the sensors whose pipelines were prepared in the creating the sensor model section.
+
+Firstly, we will clone the CalibrationTools repositories into our own Autoware project.
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY> # for example: cd autoware.tutorial_vehicle
+wget https://raw.githubusercontent.com/tier4/CalibrationTools/tier4/universe/calibration_tools.repos
+vcs import src < calibration_tools.repos
+rosdep install -y --from-paths src --ignore-src --rosdistro $ROS_DISTRO
+```
+
+Then build all the packages
+after all the necessary changes have been made to the sensor model and vehicle model.
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release
+```
+
+## Usage of CalibrationTools
+
+The CalibrationTools repository has several packages
+for calibrating different sensor pairs such as lidar-lidar,
+camera-lidar, ground-lidar etc. In order to calibrate our sensors,
+we will modify the `extrinsic_calibration_manager` package for our sensor kit.
+
+For tutorial_vehicle,
+the launch files completed by following the tutorial sections can be found [here](https://github.com/leo-drive/calibration_tools_tutorial_vehicle/tree/tutorial_vehicle/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit).
+
+- [Manual Calibration](../extrinsic-manual-calibration)
+- [Lidar-Lidar Calibration](../lidar-lidar-calibration)
+ - [Ground Plane-Lidar Calibration](../ground-lidar-calibration)
+- [Intrinsic Camera Calibration](../intrinsic-camera-calibration)
+- [Lidar-Camera Calibration](../lidar-camera-calibration)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/images/manual-calibration.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/images/manual-calibration.png
new file mode 100644
index 00000000000..99265489573
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/images/manual-calibration.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/index.md
new file mode 100644
index 00000000000..160ba6de5c7
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/extrinsic-manual-calibration/index.md
@@ -0,0 +1,331 @@
+# Manual calibration for all sensors
+
+## Overview
+
+In this section, we will use [Extrinsic Manual Calibration](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_manual.md)
+for the extrinsic calibration of our sensors.
+This process will not give us calibration results accurate enough for final use,
+but it will give us an initial calibration for the other tools.
+For example, in the lidar-lidar or camera-lidar calibration phase,
+we will need an initial calibration to get accurate and successful calibration results.
+
+We need a sample bag file for the calibration process
+which includes raw lidar topics and camera topics.
+The following shows an example of a bag file used for calibration:
+
+??? note "ROS 2 Bag example of our calibration process"
+
+ ```sh
+
+ Files: rosbag2_2023_09_06-13_43_54_0.db3
+ Bag size: 18.3 GiB
+ Storage id: sqlite3
+ Duration: 169.12s
+ Start: Sep 6 2023 13:43:54.902 (1693997034.902)
+ End: Sep 6 2023 13:46:43.914 (1693997203.914)
+ Messages: 8504
+ Topic information: Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1691 | Serialization Format: cdr
+ Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1691 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/image_rect | Type: sensor_msgs/msg/Image | Count: 2561 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/camera_info | Type: sensor_msgs/msg/CameraInfo | Count: 2561 | Serialization Format: cdr
+ ```
+
+## Extrinsic Manual-Based Calibration
+
+### Creating launch files
+
+First of all, we will start by creating a launch file for the `extrinsic_calibration_manager` package:
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY>/src/autoware/calibration_tools/sensor
+cd extrinsic_calibration_manager/launch
+mkdir <YOUR-OWN-SENSOR-KIT-NAME> # i.e. for our guide, it will be: mkdir tutorial_vehicle_sensor_kit
+cd <YOUR-OWN-SENSOR-KIT-NAME> # i.e. for our guide, it will be: cd tutorial_vehicle_sensor_kit
+touch manual.launch.xml manual_sensor_kit.launch.xml manual_sensors.launch.xml
+```
+
+We will be modifying these `manual.launch.xml`, `manual_sensors.launch.xml` and `manual_sensor_kit.launch.xml` by using TIER IV's sample sensor kit aip_x1.
+So,
+you should copy the contents of these three files from [aip_x1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_x1) to your created files.
+
+### Modifying launch files according to your sensor kit
+
+Now we can start modifying `manual.launch.xml`;
+please open this file in a text editor of your preference (VS Code, gedit, etc.).
+
+(Optionally) Let's start by adding the vehicle_id and sensor model names.
+(The values are not important here; these parameters will be overridden by launch arguments.)
+
+```diff
+ <launch>
+
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_x1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+```
+
+The final version of the file (manual.launch.xml) for tutorial_vehicle should be like this:
+
+??? note "Sample manual.launch.xml file for tutorial vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+After completing the manual.launch.xml file,
+we will be ready to implement manual_sensor_kit.launch.xml for our own sensor model's [sensor_kit_calibration.yaml](https://github.com/autowarefoundation/sample_sensor_kit_launch/blob/main/sample_sensor_kit_description/config/sensor_kit_calibration.yaml):
+
+Optionally, you can modify sensor_model and vehicle_id over this xml snippet as well:
+
+```diff
+...
+ <launch>
+
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_x1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+...
+```
+
+Then, we will add all our sensor frames to the extrinsic_calibration_manager as child frames:
+
+```diff
+ <arg name="parent_frame" default="sensor_kit_base_link"/>
+-  <let name="child_frames" value="[
+-    velodyne_top_base_link,
+-    livox_front_left_base_link,
+-    livox_front_right_base_link]"/>
++  <let name="child_frames" value="[
++    <YOUR-SENSOR-FRAME-1>,
++    <YOUR-SENSOR-FRAME-2>,
++    ...
++    <YOUR-SENSOR-FRAME-N>]"/>
+```
+
+For tutorial_vehicle there are four sensors (two lidars, one camera, one GNSS/INS), so it will be like this:
+
+??? note "i.e extrinsic_calibration_manager child_frames for tutorial_vehicle"
+
+ ```xml
+    + <arg name="parent_frame" default="sensor_kit_base_link"/>
+    + <let name="child_frames" value="[
+    +   rs_helios_top_base_link,
+    +   rs_bpearl_front_base_link,
+    +   camera0/camera_link,
+    +   gnss_link]"/>
+ ```
+
+Lastly, we will launch a manual calibrator for each sensor frame;
+please update the namespace (ns) and child_frame arguments of the calibrator.launch.xml include:
+
+```diff
+- <group>
+-  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
+-   <arg name="ns" value="$(var parent_frame)/velodyne_top_base_link"/>
+-   <arg name="child_frame" value="velodyne_top_base_link"/>
+-  </include>
+- </group>
++ <group>
++  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
++   <arg name="ns" value="$(var parent_frame)/<YOUR-SENSOR-FRAME-1>"/>
++   <arg name="child_frame" value="<YOUR-SENSOR-FRAME-1>"/>
++  </include>
++ </group>
++ <!-- repeat the group above for each remaining sensor frame -->
+```
+
+??? note "i.e., calibrator.launch.xml for each tutorial_vehicle's sensor kit"
+
+ ```xml
+    + <group>
+    +  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
+    +   <arg name="ns" value="$(var parent_frame)/rs_helios_top_base_link"/>
+    +   <arg name="child_frame" value="rs_helios_top_base_link"/>
+    +  </include>
+    + </group>
+    + <group>
+    +  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
+    +   <arg name="ns" value="$(var parent_frame)/rs_bpearl_front_base_link"/>
+    +   <arg name="child_frame" value="rs_bpearl_front_base_link"/>
+    +  </include>
+    + </group>
+    + <group>
+    +  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
+    +   <arg name="ns" value="$(var parent_frame)/camera0/camera_link"/>
+    +   <arg name="child_frame" value="camera0/camera_link"/>
+    +  </include>
+    + </group>
+    + <group>
+    +  <include file="$(find-pkg-share extrinsic_manual_calibrator)/launch/calibrator.launch.xml">
+    +   <arg name="ns" value="$(var parent_frame)/gnss_link"/>
+    +   <arg name="child_frame" value="gnss_link"/>
+    +  </include>
+    + </group>
+ ```
+
+The final version of the manual_sensor_kit.launch.xml for tutorial_vehicle should be like this:
+
+??? note "Sample [`manual_sensor_kit.launch.xml`](https://github.com/leo-drive/tutorial_vehicle_calibration_tools/blob/tutorial_vehicle/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit/manual_sensor_kit.launch.xml) for tutorial_vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+You can update `manual_sensors.launch.xml` file according to your modified [sensors_calibration.yaml](https://github.com/autowarefoundation/sample_sensor_kit_launch/blob/main/sample_sensor_kit_description/config/sensors_calibration.yaml) file.
+Since we will not be calibrating the sensor directly with respect to the base_link in tutorial_vehicle,
+we will not change this file.
+
+### Calibrating sensors with extrinsic manual calibrator
+
+After completing the manual.launch.xml and manual_sensor_kit.launch.xml files for the extrinsic_calibration_manager package,
+we need to build the package:
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --packages-select extrinsic_calibration_manager
+```
+
+Now we are ready to launch and use the manual calibrator:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:=<OWN-SENSOR-KIT-NAME> vehicle_model:=<OWN-VEHICLE-MODEL> vehicle_id:=<VEHICLE-ID>
+```
+
+For tutorial vehicle:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=manual sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+```
+
+Then play the ROS 2 bag file:
+
+```bash
+ros2 bag play <YOUR-BAG-FILE> --clock -l -r 0.2 \
+--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
+```
+
+An rqt_reconfigure window will show up;
+we will update the calibrations by hand according to the rviz2 view of the sensors.
+
+- Press the `Refresh` button, then press the `Expand All` button. The frames for tutorial_vehicle should look like this:
+
+![manual-calibration.png](images/manual-calibration.png)
+
+- Please write the target frame name in the Filter area (i.e., front, helios etc.) and select tunable_static_tf_broadcaster_node; then you can adjust the `tf_x, tf_y, tf_z, tf_roll, tf_pitch and tf_yaw` values via the rqt panel.
+- When the manual adjustment is finished, you can save your calibration results via this command:
+
+```bash
+ros2 topic pub /done std_msgs/Bool "data: true"
+```
+
+- Then you can check the output file in $HOME/\*.yaml; a sketch of what such a file can look like is shown below.
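+
+The exact contents depend on your frames, but the saved file follows the same layout as `sensor_kit_calibration.yaml`. A minimal sketch, assuming tutorial_vehicle's top lidar frame and purely illustrative values:
+
+```yaml
+sensor_kit_base_link:
+  rs_helios_top_base_link: # illustrative values, not real calibration results
+    x: 0.0
+    y: 0.0
+    z: 0.0
+    roll: 0.0
+    pitch: 0.0
+    yaw: 0.0
+```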
+
+!!! warning
+
+    The initial calibration process is important before using the other calibration tools. We will look into lidar-lidar calibration
+    and camera-lidar calibration next. Since it is hard to calibrate two sensors to exactly the same frame by hand, it is enough to find
+    an approximate (it does not have to be perfect) calibration between the sensor pairs at this point.
+
+Here is the video for demonstrating a manual calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/axHILP0PiaQ)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/after-ground-plane.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/after-ground-plane.png
new file mode 100644
index 00000000000..ee628accd0d
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/after-ground-plane.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/before-ground-plane.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/before-ground-plane.png
new file mode 100644
index 00000000000..fbd889984a7
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/images/before-ground-plane.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/index.md
new file mode 100644
index 00000000000..f89a317dfa2
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/ground-lidar-calibration/index.md
@@ -0,0 +1,325 @@
+# Ground-Lidar calibration
+
+## Overview
+
+The [Ground-Lidar Calibration](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_ground_plane.md) method operates under the assumption
+that the area surrounding the vehicle can be represented as a flat surface.
+So, you must find as wide and flat a surface as possible for the ROS 2 bag recording.
+The method then modifies the calibration transformation
+in a way that aligns the points
+corresponding to the ground within the point cloud with the XY plane of the base_link.
+This means that only the z, roll, and pitch values of the tf undergo calibration,
+while the remaining x, y, and yaw values must be calibrated using other methods,
+such as [manual adjustment](../extrinsic-manual-calibration) or [mapping-based lidar-lidar calibration](../lidar-lidar-calibration).
+
+You need to apply this calibration method to each lidar separately,
+so our bag should contain all lidars to be calibrated.
+
+We need a sample bag file for the ground-lidar calibration process
+which includes raw lidar topics.
+
+??? note "ROS 2 Bag example of our ground-based calibration process for tutorial_vehicle"
+
+ ```sh
+
+ Files: rosbag2_2023_09_05-11_23_50_0.db3
+ Bag size: 3.8 GiB
+ Storage id: sqlite3
+ Duration: 112.702s
+ Start: Sep 5 2023 11:23:51.105 (1693902231.105)
+ End: Sep 5 2023 11:25:43.808 (1693902343.808)
+ Messages: 2256
+ Topic information: Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ ```
+
+## Ground-lidar calibration
+
+### Creating launch files
+
+We will start by creating launch files for our own vehicle, as in the previous section:
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY>/src/autoware/calibration_tools/sensor
+cd extrinsic_calibration_manager/launch
+cd <YOUR-OWN-SENSOR-KIT-NAME> # i.e. for our guide, it will be: cd tutorial_vehicle_sensor_kit, which was created in manual calibration
+touch ground_plane.launch.xml ground_plane_sensor_kit.launch.xml
+```
+
+We will be modifying these `ground_plane.launch.xml` and `ground_plane_sensor_kit.launch.xml` by using TIER IV's sample sensor kit aip_x1.
+So,
+you should copy the contents of these two files from [aip_x1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_x1) to your created files.
+
+### Modifying launch files according to your sensor kit
+
+(Optionally) Let's start by adding the vehicle_id and sensor model names.
+(The values are not important here; these parameters will be overridden by launch arguments.)
+
+```diff
+ <launch>
+
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_x1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+```
+
+The final version of the file (ground_plane.launch.xml) for tutorial_vehicle should be like this:
+
+??? note "Sample ground_plane.launch.xml file for tutorial vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+After completing the ground_plane.launch.xml file,
+we will be ready to implement ground_plane_sensor_kit.launch.xml for our own sensor model.
+
+Optionally (don't forget, these parameters will be overridden by launch arguments),
+you can modify sensor_model and vehicle_id over this xml snippet, as in `ground_plane.launch.xml`.
+(You can change the rviz_profile path after saving the rviz config, as shown in the video
+included at the end of this page.)
+
+```diff
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_x1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+
+ ...
+```
+
+If you saved an rviz config file earlier for the ground plane-lidar calibration process:
+
+```diff
+- <let name="rviz_profile" value="$(find-pkg-share extrinsic_ground_plane_calibrator)/rviz/default.rviz"/>
++ <let name="rviz_profile" value="$(find-pkg-share extrinsic_ground_plane_calibrator)/rviz/tutorial_vehicle_sensor_kit.rviz"/>
+```
+
+Then, we will add all our sensor frames to the extrinsic_calibration_manager as child frames:
+
+```diff
+-  <let name="child_frames" value="[
+-    velodyne_top_base_link,
+-    livox_front_left_base_link,
+-    livox_front_right_base_link]"/>
++  <let name="child_frames" value="[
++    <YOUR-LIDAR-FRAME-1>,
++    <YOUR-LIDAR-FRAME-2>,
++    ...
++    <YOUR-LIDAR-FRAME-N>]"/>
+```
+
+For tutorial_vehicle there are two lidar sensors (rs_helios_top and rs_bpearl_front),
+so it will be like this:
+
+??? note "i.e extrinsic_calibration_manager child_frames for tutorial_vehicle"
+
+ ```xml
+    + <let name="child_frames" value="[
+    +   rs_helios_top_base_link,
+    +   rs_bpearl_front_base_link]"/>
+ ```
+
+After that, we will add our lidar sensor configurations to the ground-based calibrator;
+to do that, we will add these lines to our `ground_plane_sensor_kit.launch.xml` file:
+
+```diff
+-
+-
+-
+-
+-
+-
+-
+-
+-
++
++
++
++
++
++
++
++
++
++ ...
++ ...
++ ...
++ ...
++ ...
++
+```
+
+??? note "i.e., launch calibrator.launch.xml for each tutorial_vehicle's lidar"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+The ground_plane_sensor_kit.launch.xml launch file for tutorial_vehicle should look like this:
+
+??? note "Sample [`ground_plane_sensor_kit.launch.xml`](https://github.com/leo-drive/tutorial_vehicle_calibration_tools/blob/tutorial_vehicle/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit/ground_plane_sensor_kit.launch.xml) for tutorial_vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+### Ground plane-lidar calibration process with extrinsic ground-plane calibrator
+
+After completing the ground_plane.launch.xml and ground_plane_sensor_kit.launch.xml launch files for our own sensor kit,
+we are ready to calibrate our lidars.
+First of all, we need to build the extrinsic_calibration_manager package:
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --packages-select extrinsic_calibration_manager
+```
+
+Now we are ready to launch and use the ground plane-lidar calibrator.
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=ground_plane sensor_model:=<OWN-SENSOR-KIT-NAME> vehicle_model:=<OWN-VEHICLE-MODEL> vehicle_id:=<VEHICLE-ID>
+```
+
+For tutorial vehicle:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=ground_plane sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+```
+
+An rviz2 screen with several configurations will show up;
+you need
+to update it with your sensor topics, sensor_frames and pointcloud_inlier_topics as in the video
+included at the end of this document.
+Also, you can save the rviz2 config in the rviz directory,
+so you can use it later by modifying `ground_plane_sensor_kit.launch.xml`:
+
+```diff
+extrinsic_ground_plane_calibrator/
+ └─ rviz/
++ └─ tutorial_vehicle_sensor_kit.rviz
+```
+
+Then play the ROS 2 bag file; the calibration process will start:
+
+```bash
+ros2 bag play <YOUR-BAG-FILE> --clock -l -r 0.2 \
+--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
+```
+
+Since the calibration process is done automatically,
+you will find the updated sensor_kit_calibration.yaml in your $HOME directory after the calibration process is complete.
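+
+As explained in the overview, this method only calibrates the z, roll, and pitch values of each lidar's tf. A sketch of what a resulting entry can look like, with purely illustrative values:
+
+```yaml
+sensor_kit_base_link:
+  rs_bpearl_front_base_link:
+    x: 2.3 # unchanged, from the initial calibration
+    y: 0.0 # unchanged, from the initial calibration
+    z: 0.62 # updated by the ground plane calibrator (illustrative)
+    roll: 0.021 # updated (illustrative)
+    pitch: 0.008 # updated (illustrative)
+    yaw: 0.0 # unchanged, from the initial calibration
+```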
+
+| Before Ground Plane - Lidar Calibration | After Ground Plane - Lidar Calibration |
+| :--------------------------------------------------------: | :-------------------------------------------------------------: |
+| ![before-ground-plane.png](images/before-ground-plane.png) | ![images/after-ground-plane.png](images/after-ground-plane.png) |
+
+Here is the video for demonstrating the ground plane - lidar calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/EqaF1fufjUc)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/index.md
index 57d0a868011..a1917de804f 100644
--- a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/index.md
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/index.md
@@ -2,22 +2,35 @@
## Overview
-Autoware expects to have multiple sensors attached to the vehicle as input to perception, localization, and planning stack. These sensors must be calibrated correctly and their positions must be defined using either urdf files (as in [sample_sensor_kit](https://github.com/autowarefoundation/sample_sensor_kit_launch/tree/main/sample_sensor_kit_description)) or as tf launch files.
+Autoware expects to have multiple sensors attached to the vehicle as input to the perception, localization, and planning stacks.
+Autoware uses fusion techniques to combine information from multiple sensors.
+For this to work effectively,
+all sensors must be calibrated properly to align their coordinate systems, and their positions must be defined using either urdf files
+(as in [sample_sensor_kit](https://github.com/autowarefoundation/sample_sensor_kit_launch/tree/main/sample_sensor_kit_description))
+or as tf launch files.
+In this documentation,
+we will explain TIER IV's [CalibrationTools](https://github.com/tier4/CalibrationTools) repository for the calibration process.
+Please look
+at the [Starting with TIER IV's CalibrationTools page](./calibration-tools) for installation and usage of this tool.
-## Camera calibration
+If you want to look at other calibration packages and methods, you can check out the following packages.
-### Intrinsic Calibration
+## Other packages you can check out
+
+### Camera calibration
+
+#### Intrinsic Calibration
- Navigation2 provides a [good tutorial for camera internal calibration](https://navigation.ros.org/tutorials/docs/camera_calibration.html).
- [AutoCore](https://autocore.ai/) provides a [light-weight tool](https://github.com/autocore-ai/calibration_tools/tree/main/camera_intrinsic_calib).
-## Lidar-lidar calibration
+### Lidar-lidar calibration
-### Lidar-Lidar Calibration tool from Autocore
+#### Lidar-Lidar Calibration tool from Autocore
[LL-Calib on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/lidar-lidar-calib), provided by [AutoCore](https://autocore.ai/), is a lightweight toolkit for online/offline 3D LiDAR to LiDAR calibration. It's based on local mapping and "GICP" method to derive the relation between main and sub lidar. Information on how to use the tool, troubleshooting tips and example rosbags can be found at the above link.
-## Lidar-camera calibration
+### Lidar-camera calibration
Developed by MathWorks, The Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.
@@ -32,7 +45,7 @@ Developed by [AutoCore](https://autocore.ai/), an easy-to-use lightweight toolki
-## Lidar-IMU calibration
+### Lidar-IMU calibration
Developed by [APRIL Lab](https://github.com/APRIL-ZJU) at Zhejiang University in China, the LI-Calib calibration tool is a toolkit for calibrating the 6DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization.
IMU-based cost and LiDAR point-to-surfel (surfel = surface element) distance are minimized jointly, which renders the calibration problem well-constrained in general scenarios.
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/chess-board-7x7.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/chess-board-7x7.png
new file mode 100644
index 00000000000..fd28cf70edb
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/chess-board-7x7.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-data-collecting.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-data-collecting.png
new file mode 100644
index 00000000000..e4772a312ce
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-data-collecting.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-initial-configuration.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-initial-configuration.png
new file mode 100644
index 00000000000..0ef3507f98c
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-initial-configuration.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-topic-information.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-topic-information.png
new file mode 100644
index 00000000000..ead4182953a
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/images/intrinsic-topic-information.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/index.md
new file mode 100644
index 00000000000..bbb73962f9b
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/intrinsic-camera-calibration/index.md
@@ -0,0 +1,119 @@
+# Intrinsic camera calibration
+
+## Overview
+
+Intrinsic camera calibration is the process
+of determining the internal parameters of a camera,
+which will be used when projecting 3D information into images.
+These parameters include the focal length, optical center, and lens distortion coefficients.
+In order to perform camera intrinsic calibration,
+we will use TIER IV's [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) tool.
+First of all, we need a calibration board, which can be a dot, chess or apriltag grid board.
+In this tutorial, we will use a 7x7 chess board consisting of 7 cm squares:
+
+![chess-board-7x7](images/chess-board-7x7.png)
+
+Here are some calibration board samples from [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page:
+
+- Chess boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/checkerboard_8x6.pdf))
+- Circle dot boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/circle_8x6.pdf))
+- Apriltag grid board ([3x4 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/apriltag_grid_3x4.pdf))
+
+If you want to use a bag file for the calibration process,
+the bag file must include the `image_raw` topic of your camera sensor,
+but you can also perform the calibration in real time
+(recommended).
+
+??? note "ROS 2 Bag example for intrinsic camera calibration process"
+
+ ```sh
+
+ Files: rosbag2_2023_09_18-16_19_08_0.db3
+ Bag size: 12.0 GiB
+ Storage id: sqlite3
+ Duration: 135.968s
+ Start: Sep 18 2023 16:19:08.966 (1695043148.966)
+ End: Sep 18 2023 16:21:24.934 (1695043284.934)
+ Messages: 4122
+ Topic information: Topic: /sensing/camera/camera0/image_raw | Type: sensor_msgs/msg/Image | Count: 2061 | Serialization Format: cdr
+
+ ```
+
+## Intrinsic camera calibration
+
+Unlike the other calibration packages in our tutorials,
+this package does not need an initialization file.
+So we can start by launching the intrinsic calibrator package.
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY>
+source install/setup.bash
+```
+
+After that, we will launch the intrinsic calibrator:
+
+```bash
+ros2 launch intrinsic_camera_calibrator calibrator.launch.xml
+```
+
+Then, the initial configuration and camera intrinsic calibration panels will show up.
+We set the initial configurations for our calibration.
+
+![intrinsic-initial-configuration](images/intrinsic-initial-configuration.png)
+
+We set our image source (it can be "ROS topic", "ROS bag" or "Image files")
+from the source options section.
+We will calibrate our camera with the "ROS topic" source.
+After selecting an image source from this panel, we need to configure the "Board options" as well.
+The calibration board can be `Chess board, Dot board or Apriltag`.
+Also, we need to select the board parameters;
+to do that, click the "Board parameters" button and set the `row, column, and cell` size.
+
+After setting the image source and board parameters, we are ready for the calibration process.
+Please click the start button; you will see the `Topic configuration` panel.
+Please select the appropriate camera raw topic for the calibration process.
+
+![intrinsic-topic-information](images/intrinsic-topic-information.png)
+
+Then you are ready to calibrate.
+Please collect data at different X-Y positions, with different sizes and skews.
+You can see statistics about your collected data by clicking "View data collection statistics".
+For more information,
+please refer to the [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page.
+
+![intrinsic-data-collecting](images/intrinsic-data-collecting.png)
+
+After the data collection is completed,
+you can click the "Calibrate" button to perform the calibration process.
+After the calibration is completed,
+you will see a data visualization of the calibration result statistics.
+You can inspect your calibration results by changing the "Image view type"
+from "Source unrectified" to "Source rectified".
+If your calibration is successful (there should be no distortion in the rectified image),
+you can save your calibration results with the "Save" button.
+The output will be named `<YOUR-CAMERA-NAME>_info.yaml`,
+so you can use this file with your camera driver directly; a sketch of its layout is shown below.
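+
+The saved file follows the standard ROS camera info YAML layout. Here is a minimal sketch with purely illustrative values (your resolution, camera name and coefficients will differ):
+
+```yaml
+image_width: 1920
+image_height: 1080
+camera_name: camera0
+camera_matrix:
+  rows: 3
+  cols: 3
+  data: [1000.0, 0.0, 960.0, 0.0, 1000.0, 540.0, 0.0, 0.0, 1.0] # fx, cx, fy, cy (illustrative)
+distortion_model: plumb_bob
+distortion_coefficients:
+  rows: 1
+  cols: 5
+  data: [-0.1, 0.05, 0.0, 0.0, 0.0] # k1, k2, p1, p2, k3 (illustrative)
+```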
+
+Here is the video
+for demonstrating the intrinsic camera calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/jN77AdGFrGU)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibration-points.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibration-points.png
new file mode 100644
index 00000000000..b0014374c7a
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibration-points.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator-results.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator-results.png
new file mode 100644
index 00000000000..d23e1bcc593
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator-results.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator.png
new file mode 100644
index 00000000000..82a214575b6
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/images/interactive-calibrator.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/index.md
new file mode 100644
index 00000000000..9b78aaad1c9
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-camera-calibration/index.md
@@ -0,0 +1,348 @@
+# Lidar-Camera calibration
+
+## Overview
+
+Lidar-camera calibration is a crucial process in the field of autonomous driving and robotics,
+where both lidar sensors and cameras are used for perception.
+The goal of calibration is
+to accurately align the data from these different sensors
+in order to create a comprehensive and coherent representation of the environment
+by projecting lidar points onto the camera image.
+In this tutorial,
+we will explain [TIER IV's interactive camera-lidar calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_interactive.md).
+Also, if you have aruco marker boards for calibration,
+another [lidar-camera calibration method](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_tag_based.md) is included in TIER IV's CalibrationTools repository.
+
+!!! warning
+
+ You need to apply [intrinsic calibration](../intrinsic-camera-calibration) before starting lidar-camera extrinsic calibration process. Also,
+ please obtain the initial calibration results from the [Manual Calibration](../extrinsic-manual-calibration) section.
+ This is crucial for obtaining accurate results from this tool.
+ We will utilize the initial calibration parameters that were calculated
+ in the previous step of this tutorial.
+ To apply these initial values in the calibration tools,
+ please update your sensor calibration files within the individual parameter package.
+
+Your bag file must include the calibration lidar topic and the camera topics.
+The camera topics can be either compressed or raw,
+but remember that
+we will update the interactive calibrator launch argument `use_compressed` according to the topic type.
+
+??? note "ROS 2 Bag example of our calibration process (only one camera is mounted). If you have multiple cameras, please add their camera_info and image topics as well."
+
+ ```sh
+
+ Files: rosbag2_2023_09_12-13_57_03_0.db3
+ Bag size: 5.8 GiB
+ Storage id: sqlite3
+ Duration: 51.419s
+ Start: Sep 12 2023 13:57:03.691 (1694516223.691)
+ End: Sep 12 2023 13:57:55.110 (1694516275.110)
+ Messages: 2590
+ Topic information: Topic: /sensing/lidar/top/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 515 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/image_raw | Type: sensor_msgs/msg/Image | Count: 780 | Serialization Format: cdr
+ Topic: /sensing/camera/camera0/camera_info | Type: sensor_msgs/msg/CameraInfo | Count: 780 | Serialization Format: cdr
+
+ ```
+
+## Lidar-Camera calibration
+
+### Creating launch files
+
+We start by creating launch files for our vehicle, as in the "Extrinsic Manual Calibration"
+process:
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY>/src/autoware/calibration_tools/sensor
+cd extrinsic_calibration_manager/launch
+cd <YOUR-OWN-SENSOR-KIT-NAME> # i.e. for our guide, it will be: cd tutorial_vehicle_sensor_kit, which was created in manual calibration
+touch interactive.launch.xml interactive_sensor_kit.launch.xml
+```
+
+### Modifying launch files according to your sensor kit
+
+We will be modifying these `interactive.launch.xml` and `interactive_sensor_kit.launch.xml` by using TIER IV's sample sensor kit aip_xx1.
+So,
+you should copy the contents of these two files from [aip_xx1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_xx1) to your created files.
+
+Then we will continue with adding the vehicle_id and sensor model names to `interactive.launch.xml`.
+(Optionally; the values are not important, as these parameters will be overridden by launch arguments.)
+
+```diff
+ <launch>
+
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_xx1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+```
+
+If you want to use a concatenated pointcloud as the input cloud
+(the calibration process will initiate the logging simulator,
+resulting in the construction of the lidar pipeline and the appearance of the concatenated point cloud),
+you must set the `use_concatenated_pointcloud` value to `true`.
+
+```diff
+- <arg name="use_concatenated_pointcloud" default="false"/>
++ <arg name="use_concatenated_pointcloud" default="true"/>
+```
+
+The final version of the file (interactive.launch.xml) for tutorial_vehicle should be like this:
+
+??? note "Sample interactive.launch.xml file for tutorial vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+After completing the interactive.launch.xml file,
+we will be ready to implement interactive_sensor_kit.launch.xml for our own sensor model.
+
+Optionally (don't forget, these parameters will be overridden by launch arguments),
+you can modify sensor_model and vehicle_id over this xml snippet, as in `interactive.launch.xml`.
+We will set the parent_frame for calibration to `sensor_kit_base_link`:
+
+The default camera input topic of the interactive calibrator is the compressed image.
+If you want to use the raw image instead of the compressed image,
+you need to update the image_topic variable to your camera sensor's topic.
+
+```diff
+ ...
+-   <let name="image_topic" value="/sensing/camera/camera0/image_raw/compressed"/>
++   <let name="image_topic" value="/sensing/camera/camera0/image_raw"/>
+ ...
+```
+
+After updating your topic name,
+you need to add the use_compressed parameter (its default value is `true`)
+to the interactive_calibrator node with a value of `false`.
+
+```diff
+ ...
+ <node pkg="extrinsic_interactive_calibrator" exec="interactive_calibrator" name="interactive_calibrator" output="screen">
+   ...
++  <param name="use_compressed" value="false"/>
+ </node>
+ ...
+```
+
+Then you can customize the pointcloud topic for each camera.
+For example, if you want to calibrate camera_1 with the left lidar,
+then you should change the launch file like this:
+
+```diff
+
+- <let name="pointcloud_topic" value="/sensing/lidar/top/pointcloud_raw"/>
++ <let name="pointcloud_topic" value="/sensing/lidar/left/pointcloud_raw"/>
+
+ ...
+```
+
+The interactive_sensor_kit.launch.xml launch file for tutorial_vehicle should look like this:
+
+??? note "i.e. [`interactive_sensor_kit.launch.xml`](https://github.com/leo-drive/tutorial_vehicle_calibration_tools/blob/tutorial_vehicle/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit/interactive_sensor_kit.launch.xml) for tutorial_vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+### Lidar-camera calibration process with interactive camera-lidar calibrator
+
+After completing the interactive.launch.xml and interactive_sensor_kit.launch.xml launch files for our own sensor kit,
+we are ready to calibrate our camera and lidar.
+First of all, we need to build the extrinsic_calibration_manager package:
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --packages-select extrinsic_calibration_manager
+```
+
+Now we are ready to launch and use the interactive lidar-camera calibrator.
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=interactive sensor_model:=<OWN-SENSOR-KIT-NAME> vehicle_model:=<OWN-VEHICLE-MODEL> vehicle_id:=<VEHICLE-ID> camera_name:=<CAMERA-NAME>
+```
+
+For tutorial vehicle:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=interactive sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+```
+
+Then, we need to play our bag file.
+
+```bash
+ros2 bag play <YOUR-BAG-FILE> --clock -l -r 0.2 \
+--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
+```
+
+You will be shown the interactive calibrator rqt window and rviz2.
+You must add your lidar sensor point cloud to rviz2; then we can publish points for the calibrator.
+
+![interactive-calibrator](images/interactive-calibrator.png)
+
+- After that, let's start by pressing the `Publish Point`
+  button and selecting points on the point cloud that are also included in the projected image.
+  Then,
+  you need to click on the image point that corresponds to the projected lidar point on the image.
+  You will see the matched calibration points.
+
+![interactive-calibration-points.png](images/interactive-calibration-points.png)
+
+- The red points indicate the selected lidar points and the green ones indicate the selected image points.
+  You must match at least 6 points to perform the calibration.
+  If you have a wrong match, you can remove it by just clicking on it.
+  After selecting points on the image and the lidar, you are ready to calibrate.
+  Once at least 6 point pairs have been matched, the "Calibrate extrinsic" button will be enabled.
+  Click this button and change the tf source from `Initial /tf` to `Calibrator` to see the calibration results.
+
+![interactive-calibrator-results.png](images/interactive-calibrator-results.png)
+
+After the completion of the calibration,
+you need to save your calibration results via the "Save calibration" button.
+The saved format is JSON,
+so you need
+to update the calibration params in `sensor_kit_calibration.yaml` in the `individual_params` and `sensor_kit_description` packages yourself; see the sketch after the sample output below.
+
+??? note "Sample calibration output"
+
+ ```json
+ {
+ "header": {
+ "stamp": {
+ "sec": 1694776487,
+ "nanosec": 423288443
+ },
+ "frame_id": "sensor_kit_base_link"
+ },
+ "child_frame_id": "camera0/camera_link",
+ "transform": {
+ "translation": {
+ "x": 0.054564283153017916,
+ "y": 0.040947512210503106,
+ "z": -0.071735410952332
+ },
+ "rotation": {
+ "x": -0.49984112274024817,
+ "y": 0.4905405357176159,
+ "z": -0.5086269994990131,
+ "w": 0.5008267267391722
+ }
+ },
+ "roll": -1.5517347113946862,
+ "pitch": -0.01711459479043047,
+ "yaw": -1.5694590141484235
+ }
+ ```
+
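+As a sketch of this manual transfer, the sample output above would map to a `sensor_kit_calibration.yaml` entry like the one below (assuming the x/y/z and roll/pitch/yaw layout used earlier in this tutorial; the values are rounded from the JSON above):
+
+```yaml
+sensor_kit_base_link:
+  camera0/camera_link:
+    x: 0.054564 # "translation.x" from the JSON output
+    y: 0.040948 # "translation.y"
+    z: -0.071735 # "translation.z"
+    roll: -1.551735 # "roll"
+    pitch: -0.017115 # "pitch"
+    yaw: -1.569459 # "yaw"
+```
+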
+Here is the video
+for demonstrating the lidar-camera calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/1q4nBml9jRA)
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration-result.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration-result.png
new file mode 100644
index 00000000000..cebb427d98e
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration-result.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration.png b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration.png
new file mode 100644
index 00000000000..97b0c8eba5a
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/images/mapping-based-calibration.png differ
diff --git a/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/index.md b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/index.md
new file mode 100644
index 00000000000..87eb4ee99b4
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/creating-vehicle-and-sensor-model/calibrating-sensors/lidar-lidar-calibration/index.md
@@ -0,0 +1,392 @@
+# Lidar-Lidar calibration
+
+## Overview
+
+In this tutorial,
+we will explain lidar-lidar calibration using the [mapping-based lidar-lidar calibration tool](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_mapping_based.md) of TIER IV's
+CalibrationTools.
+
+!!! warning
+
+ Please obtain the initial calibration results from the [Manual Calibration](../extrinsic-manual-calibration) section.
+ This is crucial for obtaining accurate results from this tool.
+ We will utilize the initial calibration parameters that were calculated
+ in the previous step of this tutorial.
+ To apply these initial values in the calibration tools,
+ please update your sensor calibration files within the individual parameter package.
+
+We need a sample bag file for the lidar-lidar calibration process
+which includes raw lidar topics.
+Also, we recommend using an outlier-filtered
+point cloud for mapping, because the vehicle points
+have been cropped out of this point cloud. Therefore,
+vehicle points are not included in the map. When you start
+the bag recording,
+you should not move the vehicle for the first 5 seconds for better mapping performance.
+The following shows an example of a bag file used for this calibration:
+
+??? note "ROS 2 Bag example of our calibration process for tutorial_vehicle"
+
+ ```sh
+
+ Files: rosbag2_2023_09_05-11_23_50_0.db3
+ Bag size: 3.8 GiB
+ Storage id: sqlite3
+ Duration: 112.702s
+ Start: Sep 5 2023 11:23:51.105 (1693902231.105)
+ End: Sep 5 2023 11:25:43.808 (1693902343.808)
+ Messages: 2256
+ Topic information: Topic: /sensing/lidar/front/pointcloud_raw | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ Topic: /sensing/lidar/top/outlier_filtered/pointcloud | Type: sensor_msgs/msg/PointCloud2 | Count: 1128 | Serialization Format: cdr
+ ```
+
+## Mapping-based lidar-lidar calibration
+
+### Creating launch files
+
+We start by creating launch files for our vehicle, as in the `Extrinsic Manual Calibration`
+process:
+
+```bash
+cd <YOUR-OWN-AUTOWARE-DIRECTORY>/src/autoware/calibration_tools/sensor
+cd extrinsic_calibration_manager/launch
+cd <YOUR-OWN-SENSOR-KIT-NAME> # i.e. for our guide, it will be: cd tutorial_vehicle_sensor_kit, which was created in manual calibration
+touch mapping_based.launch.xml mapping_based_sensor_kit.launch.xml
+```
+
+We will be modifying these `mapping_based.launch.xml` and `mapping_based_sensor_kit.launch.xml` by using TIER IV's sample sensor kit aip_x1.
+So,
+you should copy the contents of these two files from [aip_x1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_x1) to your created files.
+
+Then we will continue with adding the vehicle_id and sensor model names to `mapping_based.launch.xml`:
+(Optionally; the values are not important, as these parameters will be overridden by launch arguments.)
+
+```diff
+ <launch>
+
+-  <arg name="vehicle_id" default="default"/>
++  <arg name="vehicle_id" default="tutorial_vehicle"/>
+
+-  <arg name="sensor_model" default="aip_x1"/>
++  <arg name="sensor_model" default="tutorial_vehicle_sensor_kit"/>
+```
+
+The final version of the file (mapping_based.launch.xml) for tutorial_vehicle should be like this:
+
+??? note "Sample mapping_based.launch.xml file for tutorial vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+After completing the mapping_based.launch.xml file,
+we will be ready to implement mapping_based_sensor_kit.launch.xml for our own sensor model.
+
+Optionally,
+you can modify sensor_model and vehicle_id over this xml snippet, as in `mapping_based.launch.xml`.
+(You can change the rviz_profile path after saving the rviz config, as shown in the video
+included at the end of this page.)
+
+We will add the sensor kit frames for each lidar (except the mapping lidar);
+for the tutorial vehicle we have one lidar to pair with the main lidar sensor, so it should be like this:
+
+**Note**: The mapping lidar will be used for mapping purposes, but it will not be calibrated.
+We can consider this lidar the main sensor of our hardware architecture.
+Therefore, the other lidars will be calibrated with respect to the mapping lidar (main sensor).
+
+```diff
++ <let name="calibration_sensor_kit_frames" value="[sensor_kit_base_link]"/>
+```
+
+If you saved an rviz config file earlier for the lidar-lidar calibration process:
+
+```diff
+- <let name="rviz_profile" value="$(find-pkg-share extrinsic_mapping_based_calibrator)/rviz/default.rviz"/>
++ <let name="rviz_profile" value="$(find-pkg-share extrinsic_mapping_based_calibrator)/rviz/tutorial_vehicle_sensor_kit.rviz"/>
+```
+
+??? note "i.e., If you have one main lidar for mapping, three lidar for calibration"
+
+ ```xml
+    + <let name="calibration_sensor_kit_frames" value="[sensor_kit_base_link, sensor_kit_base_link, sensor_kit_base_link]"/>
+ ```
+
+??? note "i.e., For tutorial_vehicle (one lidar main for mapping, one lidar for calibration)"
+
+ ```xml
+    + <let name="calibration_sensor_kit_frames" value="[sensor_kit_base_link]"/>
+ ```
+
+We will add lidar_calibration_service_names,
+calibration_lidar_base_frames and calibration_lidar_frames for calibrator:
+
+```diff
+-  <let name="lidar_calibration_service_names" value="[/sensor_kit/sensor_kit_base_link/livox_front_left_base_link]"/>
+-  <let name="calibration_lidar_base_frames" value="[livox_front_left_base_link]"/>
+-  <let name="calibration_lidar_frames" value="[livox_front_left]"/>
++  <let name="lidar_calibration_service_names" value="[<YOUR-SERVICE-NAME-1>, <YOUR-SERVICE-NAME-2>, ...]"/>
++  <let name="calibration_lidar_base_frames" value="[<YOUR-CALIBRATION-LIDAR-BASE-FRAME-1>, ...]"/>
++  <let name="calibration_lidar_frames" value="[<YOUR-CALIBRATION-LIDAR-FRAME-1>, ...]"/>
+```
+
+??? note "i.e., At the tutorial_vehicle it should be like this snippet"
+
+ ```xml
+    + <let name="lidar_calibration_service_names" value="[/sensor_kit/sensor_kit_base_link/rs_bpearl_front_base_link]"/>
+    + <let name="calibration_lidar_base_frames" value="[rs_bpearl_front_base_link]"/>
+    + <let name="calibration_lidar_frames" value="[rs_bpearl_front]"/>
+ ```
+
+After that, we will add the sensor topics and sensor frames. In order to do that,
+we will continue filling the `mapping_based_sensor_kit.launch.xml` with the following
+(we recommend
+using the /sensing/lidar/top/outlier_filtered/pointcloud topic as the mapping pointcloud,
+because the vehicle cloud is cropped out of this topic by the pointcloud preprocessing):
+```diff
+
+-
+-
++
++
+
+
+-
++
+```
+
+??? note "At the tutorial_vehicle it should be like this snippet."
+
+ ```xml
+    <let name="mapping_lidar_frame" value="rs_helios_top"/>
+    <let name="mapping_pointcloud" value="/sensing/lidar/top/outlier_filtered/pointcloud"/>
+
+    <let name="calibration_pointcloud_topics" value="[/sensing/lidar/front/pointcloud_raw]"/>
+ ```
+
+The mapping_based_sensor_kit.launch.xml launch file for tutorial_vehicle should look like this:
+
+??? note "i.e. [`mapping_based_sensor_kit.launch.xml`](https://github.com/leo-drive/tutorial_vehicle_calibration_tools/blob/tutorial_vehicle/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit/mapping_based_sensor_kit.launch.xml) for tutorial_vehicle"
+
+ ```xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ ```
+
+### Lidar-Lidar calibration process with interactive mapping-based calibrator
+
+After completing the mapping_based.launch.xml and mapping_based_sensor_kit.launch.xml launch files for our own sensor kit,
+we are ready to calibrate our lidars.
+First of all, we need to build the extrinsic_calibration_manager package:
+
+```bash
+colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release --packages-select extrinsic_calibration_manager
+```
+
+Now we are ready to launch and use the mapping-based lidar-lidar calibrator:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=mapping_based sensor_model:=<OWN-SENSOR-KIT-NAME> vehicle_model:=<OWN-VEHICLE-MODEL> vehicle_id:=<VEHICLE-ID>
+```
+
+For tutorial vehicle:
+
+```bash
+ros2 launch extrinsic_calibration_manager calibration.launch.xml mode:=mapping_based sensor_model:=tutorial_vehicle_sensor_kit vehicle_model:=tutorial_vehicle vehicle_id:=tutorial_vehicle
+```
+
+An rviz2 screen with several configurations will show up;
+you need
+to update it with your sensor topics as in the video
+included at the end of this document.
+Also, you can save the rviz2 config in the rviz directory,
+so you can use it later by modifying `mapping_based_sensor_kit.launch.xml`:
+
+```diff
+extrinsic_mapping_based_calibrator/
+ └─ rviz/
++ └─ tutorial_vehicle_sensor_kit.rviz
+```
+
+Then play the ROS 2 bag file:
+
+```bash
+ros2 bag play <YOUR-BAG-FILE> --clock -r 0.2 \
+--remap /tf:=/null/tf /tf_static:=/null/tf_static # if tf is recorded
+```
+
+The calibration step consists of two phases: mapping and calibration.
+When the bag starts playing, the mapping starts as well, as in the rviz2 screenshot below.
+
+![mapping-based-calibration](images/mapping-based-calibration.png)
+
+The red arrow markers indicate poses during mapping,
+the green arrow markers are special poses taken uniformly,
+and the white points indicate the constructed map.
+
+Mapping halts either upon reaching a predefined data threshold
+or can be prematurely concluded by invoking this service:
+
+```bash
+ros2 service call /NAMESPACE/stop_mapping std_srvs/srv/Empty {}
+```
+
+After the mapping phase of the calibration is completed, the calibration process will start.
+After the calibration is completed, you should see an rviz2 screen like the image below:
+
+![mapping-based-calibration-result](images/mapping-based-calibration-result.png)
+
+The red points indicate the point cloud with the initial calibration results from the [previous section](../extrinsic-manual-calibration).
+The green points indicate the aligned point cloud (the calibration result).
+The calibration results will be saved automatically to your
+`dst_yaml` ($HOME/sensor_kit_calibration.yaml in this tutorial); a sketch of what the updated entry can look like is shown below.
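+
+Unlike the ground plane method, this tool refines the full transform of each calibrated lidar. A sketch of the updated entry, assuming tutorial_vehicle's frame names and purely illustrative values:
+
+```yaml
+sensor_kit_base_link:
+  rs_bpearl_front_base_link: # calibrated with respect to the mapping (main) lidar's kit frame
+    x: 2.301 # all six values refined by the mapping-based calibrator (illustrative)
+    y: -0.012
+    z: 0.615
+    roll: 0.002
+    pitch: 0.011
+    yaw: -0.005
+```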
+
+Here is the video for demonstrating the mapping-based lidar-lidar calibration process on tutorial_vehicle:
+![type:video](https://youtube.com/embed/--WBNP76GoE)