finalize intrinsic camera calibration
Signed-off-by: ismetatabay <[email protected]>
ismetatabay committed Sep 22, 2023
1 parent 79b81dd commit e43c732
Showing 11 changed files with 190 additions and 16 deletions.
@@ -2,9 +2,46 @@

## Overview

Autoware expects to have multiple sensors attached to the vehicle as input to the perception,
localization,
and planning stacks.
These sensors must be calibrated correctly,
and their positions must be defined in the `sensor_kit_description` and `individual_params` packages.
In order to do that, we will use TIER IV's [CalibrationTools](https://github.com/tier4/CalibrationTools) repository.

## Setting the sensor_kit_base_link position relative to the base_link

In the previous section (creating the vehicle and sensor model),
we mentioned the `sensors_calibration.yaml` file.
This file stores the position and orientation of `sensor_kit_base_link` (child frame)
with respect to `base_link` (parent frame).
We need to update this relative transformation
(all values are initially set to zero when the file is created)
using the CAD data of our vehicle.

<figure markdown>
![ackermann_link](images/tutorial_vehicle_sensor_kit_base_link.png){ align=center }
<figcaption>
Our tutorial_vehicle base_link to sensor_kit_base_link transformation.
</figcaption>
</figure>

So, the `sensors_calibration.yaml` file for our tutorial_vehicle should look like this:

```yaml
base_link:
  sensor_kit_base_link:
    x: 1.600000 # meter
    y: 0.0
    z: 1.421595 # 1.151595m + 0.270m
    roll: 0.0
    pitch: 0.0
    yaw: 0.0
```

You need to update these transformation values according to the `sensor_kit_base_link` frame of your own vehicle.
You can also use the CAD values for the GNSS/INS and IMU positions in the `sensor_kit_calibration.yaml` file.
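
If you want a quick sanity check after updating these values,
you can echo the resulting transform once your vehicle and sensor model launch files are running.
This is only a sketch; it assumes the default `base_link` and `sensor_kit_base_link` frame names:

```bash
# Assumes your vehicle / sensor description launch is already running
# and publishing the static transforms from sensors_calibration.yaml.
ros2 run tf2_ros tf2_echo base_link sensor_kit_base_link
```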

## Installing TIER IV's CalibrationTools repositories on Autoware

After completing the previous steps (creating your own Autoware,
@@ -34,8 +71,48 @@ for calibrating different sensor pairs such as lidar-lidar,
camera-lidar, ground-lidar, etc. In order to calibrate our sensors,
we will modify `extrinsic_calibration_package` for our sensor kit.

For tutorial_vehicle,
the launch files completed by following the tutorial sections below can be found [here](https://github.com/leo-drive/calibration_tools_tutorial_vehicle/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/tutorial_vehicle_sensor_kit).

- [Manual Calibration](./manual-calibration.md)
- [Lidar-Lidar Calibration](./lidar-lidar-calibration.md)
- [Ground Plane-Lidar Calibration](./ground-lidar-calibration.md)
- [Intrinsic Camera Calibration](./intrinsic-camera-calibration.md)
- [Lidar-Camera Calibration](./lidar-camera-calibration.md)

## Other packages you can check out

### Camera calibration

#### Intrinsic Calibration

- Navigation2 provides a [good tutorial for camera internal calibration](https://navigation.ros.org/tutorials/docs/camera_calibration.html).
- [AutoCore](https://autocore.ai/) provides a [light-weight tool](https://github.com/autocore-ai/calibration_tools/tree/main/camera_intrinsic_calib).

### Lidar-lidar calibration

#### Lidar-Lidar Calibration tool from AutoCore

[LL-Calib on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/lidar-lidar-calib), provided by [AutoCore](https://autocore.ai/), is a lightweight toolkit for online/offline 3D LiDAR-to-LiDAR calibration. It is based on local mapping and the "GICP" method to derive the relation between the main and sub lidar. Information on how to use the tool, troubleshooting tips, and example rosbags can be found at the above link.

### Lidar-camera calibration

Developed by MathWorks, the Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.

<https://ww2.mathworks.cn/help/lidar/ug/get-started-lidar-camera-calibrator.html>

SensorsCalibration toolbox v0.1 provides another open-source method for lidar-camera calibration.
This is a project for LiDAR-to-camera calibration, including both automatic and manual calibration.

<https://github.com/PJLab-ADG/SensorsCalibration/blob/master/lidar2camera/README.md>

Developed by [AutoCore](https://autocore.ai/), this easy-to-use, lightweight toolkit for lidar-camera calibration performs a fully automatic calibration in only three steps.

<https://github.com/autocore-ai/calibration_tools/tree/main/lidar-cam-calib-related>

### Lidar-IMU calibration

Developed by [APRIL Lab](https://github.com/APRIL-ZJU) at Zhejiang University in China, the LI-Calib calibration tool is a toolkit for calibrating the 6DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization.
IMU-based cost and LiDAR point-to-surfel (surfel = surface element) distance are minimized jointly, which renders the calibration problem well-constrained in general scenarios.

[AutoCore](https://autocore.ai/) has forked the original LI-Calib tool and overwritten the Lidar input for more general usage. Information on how to use the tool, troubleshooting tips and example rosbags can be found at the [LI-Calib fork on GitHub](https://github.com/autocore-ai/calibration_tools/tree/main/li_calib).
@@ -46,7 +46,7 @@ The created `ground_plane.launch.xml` and `ground_plane_sensor_kit.launch.xml` a
provided by TIER IV.

Then we will continue with adding vehicle_id and sensor model names to the `ground_plane.launch.xml`.
(Optional: the values are not important, since these parameters will be overridden by launch arguments.)

```diff
+ <?xml version="1.0" encoding="UTF-8"?>
@@ -101,7 +101,7 @@ The final version of the file (ground_plane.launch.xml) for tutorial_vehicle sho
After completing the ground_plane.launch.xml file,
we will be ready to implement ground_plane_sensor_kit.launch.xml for our own sensor model.

Optionally (don't forget, these parameters will be overridden by ROS 2 launch arguments),
you can modify sensor_kit and vehicle_id as in `ground_plane.launch.xml` over this xml snippet:
(You can change the rviz_profile path after saving the rviz config,
as shown in the video included at the end of this page.)
@@ -316,5 +316,5 @@ you can see the sensor_kit_calibration.yaml in your $HOME directory after the ca
| :--------------------------------------------------------: | :-------------------------------------------------------------: |
| ![before-ground-plane.png](images/before-ground-plane.png) | ![images/after-ground-plane.png](images/after-ground-plane.png) |

Here is the video for demonstrating the ground plane - lidar calibration process on tutorial_vehicle:
![type:video](https://youtube.com/embed/EqaF1fufjUc)
@@ -8,10 +8,111 @@ These parameters include focal length, optical center, and lens distortion coeff
In order to perform camera intrinsic calibration,
we will use TIER IV's [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) tool.
First of all, we need a calibration board, which can be a dot, chess, or apriltag grid board.
In this tutorial, we will use this 7x7 chess board consisting of 7 cm squares:

<figure markdown>
![chess_board](images/chess-board-7x7.png){ width="500"}
<figcaption>
Our 7x7 calibration chess board for this tutorial section.
</figcaption>
</figure>

Here are some calibration board samples from [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page:

- Chess boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/checkerboard_8x6.pdf))
- Circle dot boards ([6x8 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/circle_8x6.pdf))
- Apriltag grid board ([3x4 example](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/resource/apriltag_grid_3x4.pdf))

If you want to use a bag file for the calibration process,
the bag file must include the `image_raw` topic of your camera sensor,
but you can also perform the calibration in real time (recommended).
A minimal recording command is sketched after the example below.

??? note "ROS 2 Bag example for intrinsic camera calibration process"

    ```sh
    Files:             rosbag2_2023_09_18-16_19_08_0.db3
    Bag size:          12.0 GiB
    Storage id:        sqlite3
    Duration:          135.968s
    Start:             Sep 18 2023 16:19:08.966 (1695043148.966)
    End:               Sep 18 2023 16:21:24.934 (1695043284.934)
    Messages:          4122
    Topic information: Topic: /sensing/camera/camera0/image_raw | Type: sensor_msgs/msg/Image | Count: 2061 | Serialization Format: cdr
    ```
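
If you choose the bag workflow, a minimal recording command could look like the sketch below.
The topic name comes from the example above; replace it with your own camera's raw image topic:

```bash
# Record only the raw image topic needed by the intrinsic calibrator
# (replace the topic with your camera's image_raw topic).
ros2 bag record -o intrinsic_calibration_bag /sensing/camera/camera0/image_raw
```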

## Intrinsic camera calibration

Unlike the other calibration packages in our tutorials,
this package does not need an initialization file.
So, we can start by launching the intrinsic calibrator package.

```bash
cd <YOUR-OWN-AUTOWARE-DIRECTORY>
source install/setup.bash
```

After that, we will launch the intrinsic calibrator:

```bash
ros2 launch intrinsic_camera_calibrator calibrator.launch.xml
```

Then, the initial configuration and camera intrinsic calibration panels will show up.
We will set the initial configuration for our calibration.

<figure markdown>
![chess_board](images/intrinsic-initial-configuration.png){ width="200"}
<figcaption>
Initial configuration panel
</figcaption>
</figure>

We set our image source (it can be "ROS topic", "ROS bag" or "Image files")
in the source options section.
We will calibrate our camera with the "ROS topic" source.
After selecting an image source from this panel, we need to configure the "Board options" as well.
The calibration board can be "Chess board", "Dot board" or "Apriltag".
We also need to set the board parameters;
to do that, click the "Board parameters" button and set the row, column, and cell size values.

After setting the image source and board parameters, we are ready for the calibration process.
Click the start button, and you will see the "Topic configuration" panel.
Please select the appropriate camera raw topic for the calibration process.

<figure markdown>
![chess_board](images/intrinsic-topic-information.png){ width="200"}
<figcaption>
Topic configuration panel
</figcaption>
</figure>
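
If you are not sure which raw image topic to select here,
you can list the currently published image topics first (a generic ROS 2 command, not part of the calibrator itself):

```bash
# List published topics with their types and filter for raw camera images
ros2 topic list -t | grep image_raw
```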

Then, you are ready for calibration.
Please collect data at different X-Y positions, sizes, and skews.
You can see statistics of the collected data by clicking "View data collection statistics".
For more information,
please refer to the [Intrinsic Camera Calibrator](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_intrinsic_camera.md) page.

<figure markdown>
![chess_board](images/intrinsic-data-collecting.png){ align=center}
<figcaption>
Intrinsic calibrator interface
</figcaption>
</figure>

After the data collection is completed,
you can click the "Calibrate" button to perform the calibration process.
After the calibration is completed,
you will see a data visualization of the calibration result statistics.
You can inspect your calibration results by changing the "Image view type"
from "Source unrectified" to "Source rectified".
If your calibration is successful (there should be no distortion in the rectified image),
you can save your calibration results with the "Save" button.
The output will be named `<YOUR-CAMERA-NAME>_info.yaml`,
so you can use this file directly with your camera driver.
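
How the file is consumed depends on your camera driver.
As a hedged example, many ROS 2 drivers built on `camera_info_manager` accept a `camera_info_url` parameter pointing to this file;
the driver, node name, and file path below are placeholders, not part of this tutorial:

```bash
# Example only: pass the calibration file to a driver that supports camera_info_url
ros2 run v4l2_camera v4l2_camera_node --ros-args \
  -p camera_info_url:=file://$HOME/camera0_info.yaml
```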

Here is the video
for demonstrating the intrinsic camera calibration process on tutorial_vehicle:
![type:video](https://youtube.com/embed/jN77AdGFrGU)
@@ -56,7 +56,7 @@ The created `interactive.launch.xml` and `interactive_sensor_kit.launch.xml` are
[aip_xx1](https://github.com/tier4/CalibrationTools/tree/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_xx1) provided by TIER IV.

Then we will continue with adding vehicle_id and sensor model names to the `interactive.launch.xml`.
(Optional: the values are not important, since these parameters will be overridden by launch arguments.) Then,
we will add camera_name for the camera to be calibrated
(it can be one of camera0, camera1, camera_front, etc., as a launch argument)
and the `use_concatenated_pointcloud` argument.
@@ -123,7 +123,7 @@ The final version of the file (interactive.launch.xml) for tutorial_vehicle shou
After completing the interactive.launch.xml file,
we will be ready to implement interactive_sensor_kit.launch.xml for our own sensor model.

Optionally (don't forget, these parameters will be overridden by ROS 2 launch arguments),
you can modify sensor_kit and vehicle_id as in `interactive.launch.xml` over this xml snippet.
We will set the parent_frame for calibration as `sensor_kit_base_link`:

@@ -7,11 +7,6 @@ we will explain lidar-lidar calibration over [mapping-based lidar-lidar calibrat
CalibrationTools.
Also,
the [map-based lidar-lidar calibration method](https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_map_based.md) is included in TIER IV's tools.

!!! warning

@@ -50,7 +45,7 @@ The created `mapping_based.launch.xml` and `mapping_based_sensor_kit.launch.xml`
provided by TIER IV.

Then we will continue with adding vehicle_id and sensor model names to the `mapping_based.launch.xml`.
(Optional: the values are not important, since these parameters will be overridden by launch arguments.)

```diff
+ <?xml version="1.0" encoding="UTF-8"?>
@@ -204,7 +199,7 @@ calibration_lidar_base_frames and calibration_lidar_frames for calibrator.
livox_front_right]"/>
```

??? note "i.e., At the tutorial_vehicle it should be like this snippet."
??? note "i.e., At the tutorial_vehicle it should be like this snippet"

```xml
+ <let
@@ -476,5 +471,5 @@ The green points indicate aligned point (calibration result).
The calibration results will be saved automatically to your
`dst_yaml` ($HOME/sensor_kit_calibration.yaml) in this tutorial.

Here is the video for demonstrating the mapping-based lidar-lidar calibration process on tutorial_vehicle:
![type:video](https://youtube.com/embed/--WBNP76GoE)
@@ -50,7 +50,8 @@ So, we can start modifying `manual.launch.xml`,
please open this file in a text editor of your choice (code, gedit, etc.).
The example for our tutorial_vehicle should follow these steps:

Let's start with adding vehicle_id and sensor model names.
(Optional: the values are not important, since these parameters will be overridden by launch arguments.)

```diff
+ <?xml version="1.0" encoding="UTF-8"?>
