chore: fix pre-commit config and adapt code to make all checks pass #162

Merged · 25 commits · Jun 26, 2024

Changes from 1 commit (of the 25 in this pull request)

Commits
8a09f5e
chore: fix pre-commit versions and clang-format hook failure
mojomex Jun 6, 2024
c442c8c
chore(docs): fix documentation pre-commit errors
mojomex Jun 6, 2024
084f172
chore: fix pre-commit and compiler warnings in all code files modifie…
mojomex Jun 6, 2024
2973053
chore(yamllint): ignore yaml document start in clang-format config as…
mojomex Jun 6, 2024
2bdae1d
chore: add copyright and license headers
mojomex Jun 6, 2024
83e9a85
chore(pre-commit): allow multiple documents in YAML files to make .cl…
mojomex Jun 6, 2024
cfe3d0c
chore: make pre-commit pass for parameter schema code
mojomex Jun 7, 2024
66783f5
chore: add copyright and license to all source files
mojomex Jun 7, 2024
f080524
chore: implement pre-commit suggestions in all CPP/HPP files
mojomex Jun 7, 2024
d633414
chore: whitespace changes in non-cpp files to satisfy pre-commit
mojomex Jun 7, 2024
7adf53e
chore: flake8 changes to satisfy pre-commit
mojomex Jun 7, 2024
fa95f61
fix: allow implicit conversion to status types again
mojomex Jun 7, 2024
9cfe040
chore: clean up imports
mojomex Jun 7, 2024
6748ca9
chore: add override/inline where necessary
mojomex Jun 7, 2024
3f0f35a
chore(nebula_ros): remove obsolete wrapper base types
mojomex Jun 7, 2024
3032f0b
chore: move nolint comments to be robust to formatting linebreaks
mojomex Jun 7, 2024
ff3a6a6
chore(velodyne): fix indentation in velodyne calibration files to sat…
mojomex Jun 7, 2024
b4aa564
chore(hesai): fix decoder test config yaml quotations
mojomex Jun 7, 2024
1bc62e3
chore: whitespace changes
mojomex Jun 7, 2024
592f88d
chore: remove empty, un-parseable local.cspell.json
mojomex Jun 7, 2024
950dce5
chore(prettier): ignore yaml and json files as they are handled by ya…
mojomex Jun 7, 2024
f79f42f
chore: make pre-commit pass on new files after #146
mojomex Jun 7, 2024
cf44e62
chore: update cspell to pass spell-check
mojomex Jun 7, 2024
da0bb68
chore: rename contributing.md docs page to make failing link check pass
mojomex Jun 7, 2024
dbada66
Merge remote-tracking branch 'origin/develop' into pre-commit-fixes
mojomex Jun 26, 2024
chore: whitespace changes in non-cpp files to satisfy pre-commit
mojomex committed Jun 7, 2024
commit d6334148e4b283350e8de8d7222db30f99db52a5
1 change: 0 additions & 1 deletion .github/workflows/build-and-test-differential.yaml
@@ -54,4 +54,3 @@ jobs:
fail_ci_if_error: false
verbose: true
flags: differential

6 changes: 5 additions & 1 deletion .markdown-link-check.json
@@ -1,5 +1,9 @@
{
"aliveStatusCodes": [200, 206, 403],
"aliveStatusCodes": [
200,
206,
403
],
"ignorePatterns": [
{
"pattern": "^http://localhost"
78 changes: 42 additions & 36 deletions docs/hesai_decoder_design.md
@@ -11,17 +11,19 @@ This way, runtime overhead for this generalization is `0`.
### Packet formats

For all handled Hesai sensors, the packet structure follows this rough format:

1. (optional) header: static sensor info and feature flags
2. body: point data
3. tail and other appendices: timestamp, operation mode info

### Decoding steps

For all handled Hesai sensors, decoding a packet follows these steps:

```python
def unpack(packet):
parse_and_validate(packet)
# return group: one (single-return) or more (multi-return)
# return group: one (single-return) or more (multi-return)
# blocks that belong to the same azimuth
for return_group in packet:
if is_start_of_new_scan(return_group):
@@ -40,17 +42,18 @@ def decode(return_group):
append to pointcloud
```

The steps marked with __*__ are model-specific:
The steps marked with **\*** are model-specific:

* angle correction
* timing correction
* return type assignment
- angle correction
- timing correction
- return type assignment

### Angle correction

There are two approaches between all the supported sensors:
* Calibration file based
* Correction file based (currently only used by AT128)

- Calibration file based
- Correction file based (currently only used by AT128)

For both approaches, sin/cos lookup tables can be computed.
However, the resolution and calculation of these tables are different.
@@ -60,7 +63,7 @@ However, the resolution and calculation of these tables is different.
For each laser channel, a fixed elevation angle and azimuth angle offset are defined in the calibration file.
Thus, sin/cos for elevation are only a function of the laser channel (not dependent on azimuth) while those for azimuth are a function of azimuth AND elevation.

Lookup tables for elevation can thus be sized with `n_channels`, yielding a maximum size of
Lookup tables for elevation can thus be sized with `n_channels`, yielding a maximum size of
`128 * sizeof(float) = 512B` each.

For azimuth, the size is `n_channels * n_azimuths = n_channels * 360 * azimuth_resolution <= 128 * 36000`.
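
As a rough illustration of the elevation-table sizing described above, here is a minimal C++ sketch; the names and layout are assumptions for illustration, not Nebula's actual API:

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Sketch only: names and layout are assumptions, not the actual Nebula code.
// Elevation depends only on the laser channel, so one entry per channel is
// enough: at most 128 channels * sizeof(float) = 512 B per table.
constexpr std::size_t kMaxChannels = 128;

struct ElevationLut
{
  std::array<float, kMaxChannels> sin_elevation{};
  std::array<float, kMaxChannels> cos_elevation{};

  // elevation_deg holds the per-channel angles read from the calibration file.
  void build(const std::array<float, kMaxChannels> & elevation_deg, std::size_t n_channels)
  {
    constexpr float kDegToRad = 3.14159265358979323846f / 180.0f;
    for (std::size_t c = 0; c < n_channels; ++c) {
      sin_elevation[c] = std::sin(elevation_deg[c] * kDegToRad);
      cos_elevation[c] = std::cos(elevation_deg[c] * kDegToRad);
    }
  }
};
```

The azimuth tables follow the same pattern but are indexed by both azimuth and channel, which is where the `128 * 36000` upper bound above comes from.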
@@ -94,9 +97,10 @@ While there is a wide range of different supported return modes (e.g. single (fi
Differences only arise in multi-return (dual or triple) in the output order of the returns, and in the handling of some returns being duplicates (e.g. in dual(first, strongest), the first return coincides with the strongest one).

Here is an exhaustive list of differences:
* For Dual (First, Last) `0x3B`, 128E3X, 128E4X and XT32 reverse the output order (Last, First)
* For Dual (Last, Strongest) `0x39`, all sensors except XT32M place the second strongest return in the even block if last == strongest
* For Dual (First, Strongest) `0x3c`, the same as for `0x39` holds.

- For Dual (First, Last) `0x3B`, 128E3X, 128E4X and XT32 reverse the output order (Last, First)
- For Dual (Last, Strongest) `0x39`, all sensors except XT32M place the second strongest return in the even block if last == strongest
- For Dual (First, Strongest) `0x3c`, the same as for `0x39` holds.

For all other return modes, duplicate points are output if the two returns coincide.
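
To make the duplicate-handling rule above concrete, the following is a hedged sketch of how a shared decoder could query a sensor type for this behaviour; the trait and function names are hypothetical, not the actual Nebula implementation:

```cpp
#include <cstdint>

// Sketch only: the enum values mirror the return-mode bytes quoted above;
// kPlacesSecondStrongestInEvenBlock is a hypothetical per-sensor trait.
enum class ReturnMode : std::uint8_t {
  kDualLastStrongest = 0x39,
  kDualFirstLast = 0x3B,
  kDualFirstStrongest = 0x3C,
};

template <typename SensorT>
bool even_block_holds_second_strongest(ReturnMode mode, bool duplicate_with_strongest)
{
  // Per the list above, sensors other than XT32M substitute the second
  // strongest return when the first/last return coincides with the strongest.
  const bool dual_with_strongest =
    mode == ReturnMode::kDualLastStrongest || mode == ReturnMode::kDualFirstStrongest;
  return dual_with_strongest && duplicate_with_strongest &&
         SensorT::kPlacesSecondStrongestInEvenBlock;
}
```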

@@ -119,9 +123,10 @@ Return mode handling has a default implementation that is supplemented by additi
### `AngleCorrector`

The angle corrector has three main tasks:
* compute corrected azimuth/elevation for given azimuth and channel
* implement `hasScanCompleted()` logic that decides where one scan ends and the next starts
* compute and provide lookup tables for sin/cos/etc.

- compute corrected azimuth/elevation for given azimuth and channel
- implement `hasScanCompleted()` logic that decides where one scan ends and the next starts
- compute and provide lookup tables for sin/cos/etc.

The two angle correction types are calibration-based and correction-based. In both approaches, a file from the sensor is used to extract the angle correction for each azimuth/channel.
For all approaches, cos/sin lookup tables in the appropriate size are generated (see requirements section above).
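
As a shape-only illustration of these three tasks, here is a small C++ sketch; an abstract interface is used purely for brevity, and apart from `hasScanCompleted()` the names are assumptions (the real design is template-based, as described elsewhere in this document):

```cpp
#include <cstdint>

// Sketch only: illustrates the three responsibilities, not Nebula's actual API.
struct CorrectedAngles
{
  float azimuth_rad;
  float elevation_rad;
  float sin_azimuth, cos_azimuth;      // from the precomputed lookup tables
  float sin_elevation, cos_elevation;  // (task 3)
};

class AngleCorrectorSketch
{
public:
  virtual ~AngleCorrectorSketch() = default;

  // Task 1: corrected azimuth/elevation (plus their sin/cos) for a raw block
  // azimuth and laser channel, based on the calibration or correction file.
  virtual CorrectedAngles getCorrectedAngles(std::uint32_t raw_azimuth, std::uint32_t channel) const = 0;

  // Task 2: decide whether the given azimuth starts a new scan.
  virtual bool hasScanCompleted(std::uint32_t current_azimuth) const = 0;
};
```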
@@ -133,9 +138,10 @@ It is a template class taking a sensor type `SensorT` from which packet type, an
Thus, this unified decoder is an almost zero-cost abstraction.

Its tasks are:
* parsing an incoming packet
* managing decode/output point buffers
* converting all points in the packet using the sensor-specific functions of `SensorT` where necessary

- parsing an incoming packet
- managing decode/output point buffers
- converting all points in the packet using the sensor-specific functions of `SensorT` where necessary

`HesaiDecoder<SensorT>` is a subclass of the existing `HesaiScanDecoder` to allow all template instantiations to be assigned to variables of the supertype.
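
The pattern described above can be sketched roughly as follows; member names and the point type are assumptions, and only the overall shape (templated decoder behind a non-templated base) reflects the text:

```cpp
#include <cstdint>
#include <vector>

// Sketch only: stand-ins for the real classes, to show the structure.
struct DecodedPoint { float x, y, z, intensity; };

class HesaiScanDecoderSketch  // plays the role of the HesaiScanDecoder supertype
{
public:
  virtual ~HesaiScanDecoderSketch() = default;
  virtual int unpack(const std::vector<std::uint8_t> & packet) = 0;
};

template <typename SensorT>
class HesaiDecoderSketch : public HesaiScanDecoderSketch
{
public:
  int unpack(const std::vector<std::uint8_t> & packet) override
  {
    // 1. parse the raw bytes into SensorT's packet struct
    // 2. convert points, calling SensorT's angle/timing/return-mode hooks
    // 3. append to the decode buffer; swap to the output buffer on scan cut
    (void)packet;  // body elided in this sketch
    return 0;
  }

private:
  SensorT sensor_;  // resolved at compile time: no virtual calls into SensorT
  std::vector<DecodedPoint> decode_buffer_;
  std::vector<DecodedPoint> output_buffer_;
};
```

Because only the outer `unpack()` entry point is virtual, any `HesaiDecoderSketch<SomeSensor>` can still be held through a `HesaiScanDecoderSketch` pointer, which is the property the paragraph above relies on.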

@@ -144,32 +150,32 @@ Its tasks are:
To support a new sensor model, first familiarize yourself with the already implemented decoders.
Then, consult the new sensor's datasheet and identify the following parameters:

| Parameter | Chapter | Possible values | Notes |
|-|-|-|-|
| Header format | 3.1 | `Header12B`, `Header8B`, ... | `Header12B` is the standard and comprises the UDP pre-header and header (6+6B) mentioned in the data sheets |
| Blocks per packet | 3.1 | `2`, `6`, `10`, ... | |
| Number of channels | 3.1 | `32`, `40`, `64`, ... | |
| Unit format | 3.1 | `Unit3B`, `Unit4B`, ... | |
| Angle correction | App. 3 | `CALIBRATION`, `CORRECTION`, ... | The datasheet usually specifies whether a calibration/correction file is used |
| Timing correction | App. 2 | | There is usually a block and channel component. These come in the form of formulas/lookup tables. For most sensors, these depend on return mode and for some, features like high resolution mode, alternate firing etc. might change the timing |
| Return type handling | 3.1 | | Return modes are handled identically for most sensors but some re-order the returns or replace returns if there are duplicates |
| Bytes per second | 1.4 | | |
| Lowest supported frequency | 1.4 | `5 Hz`, `10 Hz`, ... | |

| Chapter | Full title |
|-|-|
|1.4| Introduction > Specifications|
|3.1| Data Structure > Point Cloud Data Packet|
|App. 2| Absolute Time of Point Cloud Data|
|App. 3| Angle Correction|
| Parameter | Chapter | Possible values | Notes |
| -------------------------- | ------- | -------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Header format | 3.1 | `Header12B`, `Header8B`, ... | `Header12B` is the standard and comprises the UDP pre-header and header (6+6B) mentioned in the data sheets |
| Blocks per packet | 3.1 | `2`, `6`, `10`, ... | |
| Number of channels | 3.1 | `32`, `40`, `64`, ... | |
| Unit format | 3.1 | `Unit3B`, `Unit4B`, ... | |
| Angle correction | App. 3 | `CALIBRATION`, `CORRECTION`, ... | The datasheet usually specifies whether a calibration/correction file is used |
| Timing correction | App. 2 | | There is usually a block and channel component. These come in the form of formulas/lookup tables. For most sensors, these depend on return mode and for some, features like high resolution mode, alternate firing etc. might change the timing |
| Return type handling | 3.1 | | Return modes are handled identically for most sensors but some re-order the returns or replace returns if there are duplicates |
| Bytes per second | 1.4 | | |
| Lowest supported frequency | 1.4 | `5 Hz`, `10 Hz`, ... | |

| Chapter | Full title |
| ------- | ---------------------------------------- |
| 1.4 | Introduction > Specifications |
| 3.1 | Data Structure > Point Cloud Data Packet |
| App. 2 | Absolute Time of Point Cloud Data |
| App. 3 | Angle Correction |

With this information, create a `PacketMySensor` struct and `SensorMySensor` class.
Reuse already-defined structs as much as possible (c.f. `Packet128E3X` and `Packet128E4X`).

Implement timing correction in `SensorMySensor` and define the class constants `float MIN_RANGE`,
`float MAX_RANGE` and `size_t MAX_SCAN_BUFFER_POINTS`.
The former two are used for filtering out too-close and too-far away points while the latter is used to
allocate pointcloud buffers.
allocate pointcloud buffers.
Set `MAX_SCAN_BUFFER_POINTS = bytes_per_second / lowest_supported_frequency` from the parameters found above.

If there are any non-standard features your sensor has, implement them as generically as possible to allow for future sensors to re-use your code.
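
For the buffer sizing rule above, here is a worked example with made-up datasheet values; the real numbers come from chapter 1.4 of the sensor's datasheet:

```cpp
#include <cstddef>

// Hypothetical datasheet values, for illustration only.
constexpr std::size_t kBytesPerSecond = 3'000'000;
constexpr std::size_t kLowestSupportedFrequencyHz = 5;

constexpr float MIN_RANGE = 0.3f;    // hypothetical near cutoff [m]
constexpr float MAX_RANGE = 200.0f;  // hypothetical far cutoff [m]

// MAX_SCAN_BUFFER_POINTS = bytes_per_second / lowest_supported_frequency
constexpr std::size_t MAX_SCAN_BUFFER_POINTS = kBytesPerSecond / kLowestSupportedFrequencyHz;
static_assert(MAX_SCAN_BUFFER_POINTS == 600'000, "one scan at the slowest supported rotation rate");
```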
2 changes: 1 addition & 1 deletion nebula_messages/robosense_msgs/msg/RobosenseInfoPacket.msg
@@ -1,2 +1,2 @@
string lidar_model
RobosensePacket packet
RobosensePacket packet
2 changes: 1 addition & 1 deletion nebula_messages/robosense_msgs/msg/RobosensePacket.msg
@@ -1,2 +1,2 @@
builtin_interfaces/Time stamp
uint8[1248] data
uint8[1248] data
2 changes: 1 addition & 1 deletion nebula_messages/robosense_msgs/msg/RobosenseScan.msg
@@ -1,2 +1,2 @@
std_msgs/Header header
RobosensePacket[] packets
RobosensePacket[] packets
28 changes: 14 additions & 14 deletions nebula_messages/robosense_msgs/package.xml
@@ -1,22 +1,22 @@
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>robosense_msgs</name>
<version>0.0.0</version>
<description>Robosense message types for Nebula</description>
<maintainer email="memin@leodrive.ai">Mehmet Emin BASOGLU</maintainer>
<license>TODO: License declaration</license>
<name>robosense_msgs</name>
<version>0.0.0</version>
<description>Robosense message types for Nebula</description>
<maintainer email="memin@leodrive.ai">Mehmet Emin BASOGLU</maintainer>
<license>TODO: License declaration</license>

<buildtool_depend>ament_cmake_auto</buildtool_depend>
<buildtool_depend>rosidl_default_generators</buildtool_depend>
<buildtool_depend>ament_cmake_auto</buildtool_depend>
<buildtool_depend>rosidl_default_generators</buildtool_depend>

<depend>builtin_interfaces</depend>
<depend>std_msgs</depend>
<depend>builtin_interfaces</depend>
<depend>std_msgs</depend>

<exec_depend>rosidl_default_runtime</exec_depend>
<member_of_group>rosidl_interface_packages</member_of_group>
<exec_depend>rosidl_default_runtime</exec_depend>
<member_of_group>rosidl_interface_packages</member_of_group>

<export>
<build_type>ament_cmake</build_type>
</export>
<export>
<build_type>ament_cmake</build_type>
</export>
</package>
@@ -23,4 +23,4 @@ rosbag2_bagfile_information:
nanoseconds_since_epoch: 1713492677464078412
duration:
nanoseconds: 376987098
message_count: 5
message_count: 5
137 changes: 65 additions & 72 deletions scripts/profiling_runner.bash
@@ -2,10 +2,9 @@

# Configuration

print_usage () {
print_usage() {
# Print optionally passed error message
if [ -n "$1" ]
then
if [ -n "$1" ]; then
echo "$1"
fi
echo "Usage: $0 <run_name> [-t|--timeout <timeout>] [-m|--sensor-model <sensor_model>] [-n|--n-runs <n_runs>] [-f|--maxfreq <maxfreq>] [-c|--taskset-cores <core1[,core2,...]>] [-b|--rosbag-path <rosbag_path>]"
@@ -18,15 +17,14 @@ print_usage () {
}

# Default parameters
runtime=20 # seconds (rosbag runtime is timeout-3s for startup)
n_runs=3 # number of runs per test
maxfreq=2300000 # 2.3 GHz
n_cores=$(nproc --all) # number of cores of the CPU
taskset_cores=0-$(( $n_cores - 1 )) # cores to run Nebula on
nebula_dir=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )/.." &> /dev/null && pwd )

if [ $# -lt 1 ]
then
runtime=20 # seconds (rosbag runtime is timeout-3s for startup)
n_runs=3 # number of runs per test
maxfreq=2300000 # 2.3 GHz
n_cores=$(nproc --all) # number of cores of the CPU
taskset_cores=0-$((n_cores - 1)) # cores to run Nebula on
nebula_dir=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")/.." &>/dev/null && pwd)

if [ $# -lt 1 ]; then
print_usage "No run name specified"
exit 1
fi
@@ -35,62 +33,59 @@ run_name=$1
shift

#do the parsing
while [[ $# -gt 0 ]]
do
while [[ $# -gt 0 ]]; do
key="$1"

case $key in
-t|--runtime)
runtime="$2"
shift
shift
;;
-m|--sensor-model)
sensor_model="$2"
shift
shift
;;
-n|--n-runs)
n_runs="$2"
shift
shift
;;
-f|--maxfreq)
maxfreq="$2"
shift
shift
;;
-c|--taskset-cores)
taskset_cores="$2"
shift
shift
;;
-d|--nebula-dir)
nebula_dir="$2"
shift
shift
;;
-b|--rosbag-path)
rosbag_path="$2"
shift
shift
;;
*)
print_usage "Unknown option $1"
exit 1
;;
-t | --runtime)
runtime="$2"
shift
shift
;;
-m | --sensor-model)
sensor_model="$2"
shift
shift
;;
-n | --n-runs)
n_runs="$2"
shift
shift
;;
-f | --maxfreq)
maxfreq="$2"
shift
shift
;;
-c | --taskset-cores)
taskset_cores="$2"
shift
shift
;;
-d | --nebula-dir)
nebula_dir="$2"
shift
shift
;;
-b | --rosbag-path)
rosbag_path="$2"
shift
shift
;;
*)
print_usage "Unknown option $1"
exit 1
;;
esac
done
# End of configuration

if [ -z "$rosbag_path" ]
then
if [ -z "$rosbag_path" ]; then
print_usage "No --rosbag-path specified"
exit 1
fi

if [ -z "$sensor_model" ]
then
if [ -z "$sensor_model" ]; then
print_usage "No --sensor-model specified"
exit 1
fi
@@ -100,24 +95,22 @@ output_dir="$nebula_dir/profiling_output"
mkdir -p "$output_dir"
echo "Outputting to '$output_dir/$run_name-[1-$n_runs].log'"

cd "$nebula_dir"
cd "$nebula_dir" || exit

lockfreq () {
for (( i=0; i<$n_cores; i++ ))
do
echo userspace | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_governor > /dev/null
echo $maxfreq | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_setspeed > /dev/null
lockfreq() {
for ((i = 0; i < n_cores; i++)); do
echo userspace | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_governor >/dev/null
echo "$maxfreq" | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_setspeed >/dev/null
echo -n "CPU $i's frequency is: "
sudo cat /sys/devices/system/cpu/cpu$i/cpufreq/cpuinfo_cur_freq
done

sudo sh -c "echo 0 > /sys/devices/system/cpu/cpufreq/boost"
}

unlockfreq () {
for (( i=0; i<$n_cores; i++ ))
do
echo schedutil | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_governor > /dev/null
unlockfreq() {
for ((i = 0; i < n_cores; i++)); do
echo schedutil | sudo tee /sys/devices/system/cpu/cpufreq/policy$i/scaling_governor >/dev/null
echo -n "CPU $i's frequency is: "
sudo cat /sys/devices/system/cpu/cpu$i/cpufreq/cpuinfo_cur_freq
done
@@ -129,6 +122,7 @@ echo "Setting up for compiling"
unlockfreq

colcon build --symlink-install --packages-up-to nebula_ros --cmake-args -DCMAKE_BUILD_TYPE=Release || exit 1
# shellcheck disable=SC1091
source "$nebula_dir/install/setup.bash"

ros2 daemon stop
@@ -137,15 +131,14 @@ ros2 daemon start
echo "Setting up for test run"
lockfreq

for (( i=1; i<=$n_runs; i++ ))
do
for ((i = 1; i <= n_runs; i++)); do
echo "Running iteration $i of $n_runs"
# set the log level to debug for all ros 2 nodes

timeout -s INT $runtime taskset -c "$taskset_cores" ros2 launch nebula_ros nebula_launch.py "sensor_model:=$sensor_model" launch_hw:=false debug_logging:=true > "$output_dir/$run_name-$i.log" 2>&1 &
(sleep 3 && timeout -s KILL $(( $runtime - 3 )) nohup ros2 bag play -l "$rosbag_path") > /dev/null &
timeout -s INT "$runtime" taskset -c "$taskset_cores" ros2 launch nebula_ros nebula_launch.py "sensor_model:=$sensor_model" launch_hw:=false debug_logging:=true >"$output_dir/$run_name-$i.log" 2>&1 &
(sleep 3 && timeout -s KILL $((runtime - 3)) nohup ros2 bag play -l "$rosbag_path") >/dev/null &
wait
done

echo "Unlocking frequency"
unlockfreq
unlockfreq