# 🎉 T-ITS ACCEPTED! 🎊

## 📃 Safe Human-in-the-loop RL (SafeHiL-RL) with Shared Control for End-to-End Autonomous Driving

🔥 Source Code Released! 🔥

💫 As a pioneering work on guidance safety within the human-in-the-loop RL paradigm, SafeHiL-RL introduces a 🔥 curriculum guidance mechanism 🔥, inspired by the whole-to-part pedagogical pattern in human education, to standardize the intervention process of human participants.

🚗 Through the human-AI shared autonomy technique, SafeHiL-RL prevents the policy oscillation or divergence caused by inappropriate or degraded human guidance during interventions, thereby improving learning efficiency, robustness, and driving safety.

🔧 Implemented in the SMARTS simulator on Ubuntu 20.04 with PyTorch.

Email: [email protected]

## Framework

## Frenet-based Dynamic Potential Field (FDPF)

## Demonstration (accelerated videos)

### Lane-change Performance

sahil1_lanechange.mp4

### Uncooperative Road User

sahil_uncooperated.mp4

### Cooperative Road User

sahil_cooperated.mp4

### Unobserved Road Structure

sahil_unobserved.mp4

## User Guide

### Clone the repository

`cd` to your workspace and clone the repo:

```bash
git clone https://github.com/OscarHuangWind/Safe-Human-in-the-Loop-RL.git
```

### Create a new Conda environment

`cd` to the repository and create the environment from the provided environment.yml:

```bash
conda env create -f environment.yml
```

### Activate the virtual environment

```bash
conda activate safehil-rl
```

### Install PyTorch

Select the build that matches your CUDA version and device (CPU/GPU), e.g.:

```bash
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
```

### Install SMARTS

```bash
# Download SMARTS
git clone https://github.com/huawei-noah/SMARTS.git
cd <path/to/SMARTS>

# Important! Check out the comp-1 branch
git checkout comp-1

# Install the system requirements
bash utils/setup/install_deps.sh

# Install SMARTS
pip install -e '.[camera_obs,test,train]'

# Install extra dependencies
pip install -e '.[extras]'
```

### Build the scenario

```bash
cd <path/to/Safe-Human-in-the-loop-RL>
scl scenario build --clean scenario/straight/
```

### Visualization

```bash
scl envision start
```

Then open http://localhost:8081/ in your browser.

### Training

Modify the sys path in the main.py file (a hedged sketch follows), then run:

```bash
python main.py
```
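
For reference, the edit is typically a `sys.path` entry pointing at your local SMARTS checkout; the exact line in main.py may differ, and the path below is a placeholder:

```python
# Illustrative only: make the local SMARTS checkout importable before the
# SMARTS imports in main.py; replace the placeholder with your own path.
import sys
sys.path.append('/path/to/SMARTS')
```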

### Human Guidance

Change the model in the main.py file to SaHiL, PHIL, or HIRL (see the illustrative snippet below), then run:

```bash
python main.py
```
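
For illustration, the switch is a single assignment in main.py; the variable name here is a guess and may differ in the repo:

```python
# Hypothetical illustration: select one of the human-guidance variants.
model = 'SaHiL'  # alternatives: 'PHIL', 'HIRL'
```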

Check the code in keyboard.py to get an idea of the keyboard control; a hedged sketch of the mapping follows.
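
As a rough illustration of what such a mapping can look like (not the repo's actual implementation), here is a minimal pygame-style sketch, assuming the arrow keys drive steering and throttle:

```python
# Hypothetical sketch of a keyboard-to-control mapping, assuming a pygame
# event loop; the actual bindings live in keyboard.py and may differ.
import pygame

def read_keyboard_action():
    """Map arrow keys to a (steering, throttle) intervention signal."""
    pygame.event.pump()              # refresh the key state
    keys = pygame.key.get_pressed()
    steering = 0.0
    if keys[pygame.K_LEFT]:
        steering = -0.3
    elif keys[pygame.K_RIGHT]:
        steering = 0.3
    throttle = 0.0
    if keys[pygame.K_UP]:
        throttle = 0.5
    elif keys[pygame.K_DOWN]:
        throttle = -0.5              # brake / reverse
    return steering, throttle
```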

Alternatively, you can use a Logitech G29 wheel set to intervene in the vehicle control; see lines 177 to 191 in the main.py file for the details.

The "Egocentric View" is recommended for the human guidance.

### Evaluation

Set the mode in config.yaml to evaluation (see the illustrative snippet below) and run:

```bash
python main.py
```
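
For illustration, the relevant entry in config.yaml would look like the following; the surrounding keys may differ in the repo:

```yaml
# config.yaml (illustrative)
mode: evaluation
```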