Prepare the code for the first release
GiulioRomualdi committed Nov 29, 2024 (commit 54739cb, parent 4d2d60a)
Showing 4 changed files with 106 additions and 83 deletions.
17 changes: 17 additions & 0 deletions .github/workflows/docker.yaml
@@ -0,0 +1,17 @@
name: Publish Docker Image

on: [workflow_dispatch]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Publish to Registry
        uses: elgohr/Publish-Docker-Github-Action@v5
        with:
          name: ami-iit/dnn-mpc-walking-docker
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
          workdir: dockerfiles
          registry: ghcr.io
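The workflow is trigger-only (`workflow_dispatch`), so the image is published on demand. A sketch of how it could be started from the command line, assuming the GitHub CLI is installed and authenticated with permission to run workflows on this repository:

```bash
# Manually dispatch the image-publishing workflow (sketch, requires gh auth)
gh workflow run docker.yaml --repo ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
```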
2 changes: 1 addition & 1 deletion CMakeLists.txt
@@ -6,7 +6,7 @@ cmake_minimum_required(VERSION 3.14)

## MAIN project
project(DNNMPCWalking
-       VERSION 0.0.1)
+       VERSION 1.0.0)

list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake)

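Since the Dockerfile later in this commit checks out the `v1.0.0` tag, the version bump above is expected to be paired with a matching release tag. A sketch of how such a tag could be created and pushed, assuming maintainer access to the repository:

```bash
# Tag the release so that the project VERSION, the Docker build, and the Git history agree (sketch)
git tag -a v1.0.0 -m "Prepare the code for the first release"
git push origin v1.0.0
```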
169 changes: 87 additions & 82 deletions README.md
@@ -3,126 +3,131 @@ Online DNN-Driven Nonlinear MPC for Stylistic Humanoid Robot Walking with Step A
</h1>

<div align="center">
Giulio Romualdi, Paolo Maria Viceconte, Lorenzo Moretti, Ines Sorrentino, Stefano Dafarra, Silvio Traversaro, and Daniele Pucci
<br>
<b>Co-first authors: Paolo Maria Viceconte and Giulio Romualdi</b>
</div>
<br>

<div align="center">
📅 Accepted for publication at the <b>2024 IEEE-RAS International Conference on Humanoid Robots</b> (Humanoids), Nancy, France 🤖
</div>
<br>

<div align="center">
<a href="https://arxiv.org/abs/2410.07849"><b>📚 Paper</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://www.youtube.com/watch?v=x3tzEfxO-xQ"><b>🎥 Video</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking/blob/main/misc/poster/humanoids-2024-dnn-mpc.pdf"><b>🖼️ Poster</b></a> &nbsp;&nbsp;&nbsp;
<a href="#reproducing-the-experiments"><b>🔧 Experiments</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://sites.google.com/view/dnn-mpc-walking/home-page"><b>🌐 Website</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://huggingface.co/datasets/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking_dataset"><b>📂 Dataset</b></a>
<a href="https://arxiv.org/abs/2410.07849"><b>📚 Paper</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://www.youtube.com/watch?v=x3tzEfxO-xQ"><b>🎥 Video</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking/blob/main/misc/poster/humanoids-2024-dnn-mpc.pdf"><b>🖼️ Poster</b></a> &nbsp;&nbsp;&nbsp;
<a href="#reproducing-the-experiments"><b>🔧 Experiments</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://sites.google.com/view/dnn-mpc-walking/home-page"><b>🌐 Website</b></a> &nbsp;&nbsp;&nbsp;
<a href="https://huggingface.co/datasets/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking_dataset"><b>📂 Dataset</b></a>
</div>

<br>

https://github.com/user-attachments/assets/404c9af8-528e-43c2-abd2-138a98adfc04

---

## Reproducing the Experiments

You can reproduce the experiments using **Docker**, **Conda**, or **Pixi**.

### Docker

Run the experiments via Docker for an isolated and reproducible environment.

1. Pull the Docker image:
```bash
docker pull ghcr.io/ami-iit/dnn-mpc-walking-docker:latest
```

2. Launch the container (see the X11 access note after these steps):
```bash
xhost +
docker run -it --rm \
--device=/dev/dri:/dev/dri \
--env="DISPLAY=$DISPLAY" \
--net=host \
ghcr.io/ami-iit/dnn-mpc-walking-docker:latest
```

3. Wait for `Gazebo` to start and launch the experiment.

> ⚠️ **Known Issue:** The Gazebo real-time factor is scaled by a factor of 10 due to the MUMPS linear solver in the IPOPT Docker image. Alternative solvers (e.g., MA97) are available but cannot be redistributed.
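Note that `xhost +` in step 2 disables X11 access control for every host. A slightly tighter variant, sketched below under the assumption of a local X server, grants access only to local clients and revokes it once you are done:

```bash
xhost +local:    # allow local clients only, instead of all hosts
docker run -it --rm \
    --device=/dev/dri:/dev/dri \
    --env="DISPLAY=$DISPLAY" \
    --net=host \
    ghcr.io/ami-iit/dnn-mpc-walking-docker:latest
xhost -local:    # revoke the grant after the container exits
```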
---

### Conda

Follow these steps to set up the experiments using Conda:

1. Install the environment:
```bash
conda env create -f environment.yml
```

2. Activate the environment:
```bash
conda activate dnn-mpc-env
```

3. Compile the code (see the install-prefix note after these steps):
```bash
cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
mkdir build && cd build
cmake ..
make -j
make install
```

4. Run the simulation:
```bash
./run_simulation.sh
```

The script launches the Gazebo simulator, starts the YARP server used to communicate with the simulator, and initializes the DNN-driven MPC controller. When prompted, type `y` and press `Enter` to start the simulation; the humanoid robot will begin walking, and you can observe its behavior in Gazebo.
> ⚠️ The Gazebo real-time factor is scaled by a factor of 10 due to the MUMPS linear solver.
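The plain `make install` in step 3 uses CMake's default install prefix. To keep everything inside the `dnn-mpc-env` environment instead, the install prefix can point at the active conda environment, mirroring what the project's Dockerfile does. A sketch, assuming the environment is active so that `$CONDA_PREFIX` is set:

```bash
# From the build directory: configure so that 'make install' targets the active conda environment (sketch)
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="$CONDA_PREFIX" ..
make -j
make install
```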
---

### Pixi

To run the experiments with Pixi:

1. Clone the repository:
```bash
git clone https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
```

2. Run the simulation:
```bash
pixi run -e default run_simulation
```
This single command installs all the dependencies, compiles the code, and runs the simulation; when prompted, type `y` and press `Enter` to start it.

> **Using the MA97 Solver (Optional):**
> If you have access to a Coin-HSL licence, you can use the MA97 linear solver to improve performance. Academics can request a free licence, and others can purchase one, at https://licences.stfc.ac.uk/product/coin-hsl; once the order is approved, the archive can be downloaded from https://licences.stfc.ac.uk/account/orders.
> 1. Obtain the Coin-HSL archive (`coinhsl-2023.11.17.zip`) and place it in the `./coinhsl_src` folder.
> 2. Run:
> ```bash
> pixi run -e coinhsl run_simulation
> ```
> Besides the steps performed by the `default` environment, this compiles Coin-HSL, switches the configuration files to the `ma97` solver, and uses a different Gazebo world, reducing the real-time factor scaling from 10 to 2.
---
## Maintainers
<table align="left">
<tr>
<td align="center">
<a href="https://github.com/paolo-viceconte">
<img src="https://github.com/paolo-viceconte.png" width="40" alt="Paolo Maria Viceconte"><br>
👨‍💻 @paolo-viceconte
</a>
</td>
<td align="center">
<a href="https://github.com/GiulioRomualdi">
<img src="https://github.com/GiulioRomualdi.png" width="40" alt="Giulio Romualdi"><br>
👨‍💻 @GiulioRomualdi
</a>
</td>
</tr>
<table>
<tr>
<td align="center">
<a href="https://github.com/paolo-viceconte">
<img src="https://github.com/paolo-viceconte.png" width="80" alt="Paolo Maria Viceconte"><br>
👨‍💻 Paolo Maria Viceconte
</a>
</td>
<td align="center">
<a href="https://github.com/GiulioRomualdi">
<img src="https://github.com/GiulioRomualdi.png" width="80" alt="Giulio Romualdi"><br>
👨‍💻 Giulio Romualdi
</a>
</td>
</tr>
</table>
1 change: 1 addition & 0 deletions dockerfiles/Dockerfile
@@ -35,6 +35,7 @@ WORKDIR /workspace
# Clone the repository with its submodules, compile it, and install it in the conda environment
RUN git clone https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking.git && \
cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking && \
git checkout v1.0.0 && \
mkdir build && \
cd build && \
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/opt/conda/envs/dnn-mpc-env .. && \
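For local changes to the image, the same Dockerfile can be built directly instead of pulling the published image from `ghcr.io`. A sketch, assuming Docker is installed and the command is run from the repository root:

```bash
# Build the image locally from the dockerfiles/ directory used by the CI workflow (sketch)
docker build -t dnn-mpc-walking-docker:local dockerfiles/
```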
