Commit a154f78

Merge branch 'main' into feature/layout-support

kozlov721 authored Jul 24, 2024
2 parents bafeb7b + 0e95d50 commit a154f78
Showing 20 changed files with 194 additions and 134 deletions.
28 changes: 19 additions & 9 deletions .github/workflows/modelconverter_test.yaml
@@ -13,17 +13,27 @@ permissions:
packages: read

env:
DOCKERFILE: docker/${{ inputs.package }}/Dockerfile
TAG: luxonis/modelconverter-${{ inputs.package }}:latest
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_S3_ENDPOINT_URL: ${{ secrets.AWS_S3_ENDPOINT_URL }}
PACKAGE: ${{ inputs.package }}

jobs:
tests:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-13]

runs-on: ${{ matrix.os }}

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Set up Docker
uses: crazy-max/ghaction-setup-docker@v3

- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
@@ -47,12 +57,12 @@ jobs:
credentials_json: ${{ secrets.GCP_CREDENTIALS }}
token_format: access_token

- name: Run Tests
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_S3_ENDPOINT_URL: ${{ secrets.AWS_S3_ENDPOINT_URL }}
PACKAGE: ${{ inputs.package }}
- name: Run Tests (Ubuntu)
if: ${{ matrix.os == 'ubuntu-latest' }}
run: |
pytest -s --verbose "tests/test_packages/test_$PACKAGE.py"
- name: Run Tests (macOS)
if: ${{ matrix.os == 'macos-13' }}
run: |
pytest -s --verbose "tests/test_packages/test_${{ env.PACKAGE }}.py"
38 changes: 21 additions & 17 deletions README.md
@@ -1,8 +1,8 @@
# ModelConverter - Compilation Library

[![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![PyPI](https://img.shields.io/pypi/v/luxonis-ml?label=pypi%20package)](https://pypi.org/project/modelconv/)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/luxonis-ml)](https://pypi.org/project/modelconv/)
[![PyPI](https://img.shields.io/pypi/v/modelconv?label=pypi%20package)](https://pypi.org/project/modelconv/)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/modelconv)](https://pypi.org/project/modelconv/)

[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Docformatter](https://img.shields.io/badge/%20formatter-docformatter-fedcba.svg)](https://github.com/PyCQA/docformatter)
@@ -24,6 +24,8 @@ Convert your **ONNX** models to a format compatible with any generation of Luxon
- [ModelConverter - Compilation Library](#modelconverter---compilation-library)
- [Table of Contents](#table-of-contents)
- [Installation](#installation)
- [Before You Begin](#before-you-begin)
- [Instructions](#instructions)
- [GPU Support](#gpu-support)
- [Running ModelConverter](#running-modelconverter)
- [Sharing Files](#sharing-files)
@@ -38,6 +40,13 @@ Convert your **ONNX** models to a format compatible with any generation of Luxon

## Installation

### System Requirements

`ModelConverter` requires `docker` to be installed on your system.
Ubuntu is recommended for the best compatibility.
On Windows or macOS, install `docker` via [Docker Desktop](https://www.docker.com/products/docker-desktop).
Otherwise, follow the installation instructions for your OS on the [official website](https://docs.docker.com/engine/install/).
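As a quick pre-flight check, here is a small Python sketch (a hypothetical helper, not part of ModelConverter) that verifies the `docker` CLI is on the `PATH` before you proceed:

```python
import shutil

def docker_available() -> bool:
    """Return True if the `docker` CLI is on PATH (a minimal pre-flight check)."""
    return shutil.which("docker") is not None

if not docker_available():
    print("docker not found -- install Docker Desktop or Docker Engine first")
```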

### Before You Begin

`ModelConverter` is in an experimental public beta stage. Some parts might change in the future.
@@ -57,24 +66,19 @@ Requires `snpe.zip` archive to be present in `docker/extra_packages`. You can do

Requires `hailo_ai_sw_suite_2024-04:1` docker image to be present on the system. You can download the image from the [Hailo website](https://developer.hailo.ai/developer-zone/sw-downloads/).

### Instructions

1. For easier use, install the ModelConverter CLI from PyPI:

```bash
pip install modelconv
```

1. Build the docker image:

```bash
docker build -f docker/<package>/Dockerfile.public -t luxonis/modelconverter-<package>:latest .
```

For usage instructions, see `modelconverter --help`.

### GPU Support

@@ -105,7 +109,7 @@ shared_with_container/
│ └── <models will be downloaded here>
└── outputs/
└── <output_dir_name>
└── <output_dir>
├── resnet18.onnx
├── resnet18.dlc
├── logs.txt
@@ -118,7 +122,7 @@ While adhering to this structure is not mandatory as long as the files are visib

The converter first searches for files exactly at the provided path. If not found, it searches relative to `/app/shared_with_container/`.
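The lookup order described above can be sketched as follows (an illustration assuming the shared directory is mounted at `/app/shared_with_container`; `resolve_input` is a hypothetical helper, not the converter's actual code):

```python
from pathlib import Path

SHARED_ROOT = Path("/app/shared_with_container")

def resolve_input(path_str: str) -> Path:
    # First, try the path exactly as given...
    p = Path(path_str)
    if p.exists():
        return p
    # ...then fall back to resolving it relative to the shared directory.
    candidate = SHARED_ROOT / path_str
    if candidate.exists():
        return candidate
    raise FileNotFoundError(path_str)
```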

The `output_dir_name` can be specified in the config file. If such a directory already exists, the `output_dir_name` will be appended with the current date and time. If not specified, the `output_dir_name` will be autogenerated in the following format: `<model_name>_to_<target>_<date>_<time>`.
The output directory can be specified with the `--output-dir` CLI argument. If the specified directory already exists, it is removed and recreated. If not specified, the directory name is autogenerated in the format `<model_name>_to_<target>_<date>_<time>`.
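The naming scheme can be sketched as follows (a standalone illustration; `resolve_output_dir` is a hypothetical helper, not the CLI's actual function):

```python
from datetime import datetime
from pathlib import Path
from typing import Optional

OUTPUTS_DIR = Path("shared_with_container/outputs")

def resolve_output_dir(model_name: str, target: str, output_dir: Optional[str]) -> Path:
    # Use an explicit name if given; otherwise autogenerate
    # `<model_name>_to_<target>_<date>_<time>`.
    if output_dir is not None:
        return OUTPUTS_DIR / output_dir
    stamp = datetime.now().strftime("%Y_%m_%d_%H_%M_%S")
    return OUTPUTS_DIR / f"{model_name}_to_{target.lower()}_{stamp}"

# prints e.g. shared_with_container/outputs/resnet18_to_rvc4_2024_07_24_10_30_00
print(resolve_output_dir("resnet18", "RVC4", None))
```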

### Usage

@@ -276,7 +280,7 @@ To run the inference, use:
```bash
modelconverter infer rvc4 \
--model-path <path_to_model.dlc> \
--output-dir <output_dir_name> \
--input-path <input_path> \
--path <path_to_config.yaml>
```
70 changes: 45 additions & 25 deletions modelconverter/__main__.py
@@ -95,7 +95,7 @@ class Format(str, Enum):
),
]
ModelPathOption: TypeAlias = Annotated[
str, typer.Option(help="Path to or url of the model file.")
str, typer.Option(help="A URL or a path to the model file.")
]

DockerOption: TypeAlias = Annotated[
@@ -112,16 +112,24 @@ class Format(str, Enum):
typer.Option(help="Use GPU for conversion. Only relevant for HAILO."),
]

OutputDirOption: TypeAlias = Annotated[
Optional[str],
typer.Option(
..., "--output-dir", "-o", help="Name of the output directory."
),
]


def get_output_dir_name(target: Target, config: Config) -> Path:
def get_output_dir_name(
target: Target, name: str, output_dir: Optional[str]
) -> Path:
date = datetime.now().strftime("%Y_%m_%d_%H_%M_%S")
if config.output_dir_name is not None:
output_dir_name = config.output_dir_name
if (OUTPUTS_DIR / output_dir_name).exists():
shutil.rmtree(OUTPUTS_DIR / output_dir_name)
if output_dir is not None:
if (OUTPUTS_DIR / output_dir).exists():
shutil.rmtree(OUTPUTS_DIR / output_dir)
else:
output_dir_name = f"{config.name}_to_{target.name.lower()}_{date}"
return OUTPUTS_DIR / output_dir_name
output_dir = f"{name}_to_{target.name.lower()}_{date}"
return OUTPUTS_DIR / output_dir


def get_configs(
@@ -205,19 +213,22 @@ def infer(
Path,
typer.Option(
...,
"--input-path",
"-i",
help="Path to the directory with data for inference. "
"The directory must contain one subdirectory per input, named the same as the input. "
"Inference data must be provided in the NPY format.",
),
],
path: PathOption,
dest: Annotated[
Path, typer.Option(..., help="Path to the output directory.")
],
output_dir: OutputDirOption = None,
stage: Annotated[
Optional[str],
typer.Option(
help="Name of the stage to run. Only needed for multistage configs."
...,
"--stage",
"-s",
help="Name of the stage to run. Only needed for multistage configs.",
),
] = None,
dev: DevOption = False,
@@ -240,8 +251,11 @@ def infer(
try:
mult_cfg, _, _ = get_configs(path, opts)
cfg = mult_cfg.get_stage_config(stage)
output_path = get_output_dir_name(
target, mult_cfg.name, output_dir
)
Inferer = get_inferer(target)
Inferer.from_config(model_path, input_path, dest, cfg).run()
Inferer.from_config(model_path, input_path, output_path, cfg).run()
except Exception:
logger.exception("Encountered an unexpected error!")
exit(2)
@@ -253,11 +267,11 @@ def infer(
str(model_path),
"--input-path",
str(input_path),
"--dest",
str(dest),
"--path",
str(path),
]
if output_dir is not None:
args.extend(["--output-dir", output_dir])
if opts is not None:
args.extend(opts)
docker_exec(target.value, *args, tag=tag, use_gpu=gpu)
@@ -340,6 +354,7 @@ def benchmark(
def convert(
target: TargetArgument,
path: PathOption = None,
output_dir: OutputDirOption = None,
dev: DevOption = False,
to: FormatOption = Format.NATIVE,
gpu: GPUOption = True,
@@ -391,24 +406,24 @@ def convert(
if archive_preprocess:
cfg, preprocessing = extract_preprocessing(cfg)

output_dir = get_output_dir_name(target, cfg)
output_dir.mkdir(parents=True, exist_ok=True)
output_path = get_output_dir_name(target, cfg.name, output_dir)
output_path.mkdir(parents=True, exist_ok=True)
reset_logging()
setup_logging(
file=str(output_dir / "modelconverter.log"), use_rich=True
file=str(output_path / "modelconverter.log"), use_rich=True
)
if is_multistage:
from modelconverter.packages.multistage_exporter import (
MultiStageExporter,
)

exporter = MultiStageExporter(
target=target, config=cfg, output_dir=output_dir
target=target, config=cfg, output_dir=output_path
)
else:
exporter = get_exporter(target)(
config=next(iter(cfg.stages.values())),
output_dir=output_dir,
output_dir=output_path,
)

out_models = exporter.run()
@@ -419,9 +434,12 @@ def convert(

logger.info("Converting to NN archive")
assert main_stage is not None
if len(out_models) > 1:
model_name = f"{main_stage}{out_models[0].suffix}"
else:
model_name = out_models[0].name
nn_archive = modelconverter_config_to_nn(
cfg,
target,
Path(model_name),
archive_cfg,
preprocessing,
main_stage,
@@ -430,13 +448,13 @@ def convert(
else exporter.exporters[main_stage].inference_model_path,
)
generator = ArchiveGenerator(
archive_name=cfg.name,
save_path=str(output_dir),
archive_name=f"{cfg.name}.{target.value.lower()}",
save_path=str(output_path),
cfg_dict=nn_archive.model_dump(),
executables_paths=[
str(out_model) for out_model in out_models
]
+ [str(output_dir / "buildinfo.json")],
+ [str(output_path / "buildinfo.json")],
)
out_models = [generator.make_archive()]
logger.info(f"Model exported to {out_models[0]}")
@@ -479,6 +497,8 @@ def convert(
]
if main_stage is not None:
args.extend(["--main-stage", main_stage])
if output_dir is not None:
args.extend(["--output-dir", output_dir])
if path is not None:
args.extend(["--path", path])
if opts is not None:
12 changes: 6 additions & 6 deletions modelconverter/packages/rvc2/exporter.py
@@ -21,9 +21,9 @@

logger = getLogger(__name__)

COMPILE_TOOL: Final[
str
] = f'{env["INTEL_OPENVINO_DIR"]}/tools/compile_tool/compile_tool'
COMPILE_TOOL: Final[str] = (
f'{env["INTEL_OPENVINO_DIR"]}/tools/compile_tool/compile_tool'
)

DEFAULT_SUPER_SHAVES: Final[int] = 8

@@ -141,9 +141,9 @@ def export(self) -> Path:
if self.superblob:
return self.compile_superblob(args)

return self.compile(args)
return self.compile_blob(args)

def compile(self, args: list) -> Path:
def compile_blob(self, args: list) -> Path:
output_path = (
self.output_dir / f"{self.model_name}-{self.target.name.lower()}"
)
@@ -172,7 +172,7 @@ def compile_superblob(self, args: list) -> Path:

orig_args = args.copy()

default_blob_path = self.compile(
default_blob_path = self.compile_blob(
orig_args
+ [
"-o",
6 changes: 4 additions & 2 deletions modelconverter/packages/rvc4/benchmark.py
@@ -51,8 +51,8 @@ def _adb_run(self, args, **kwargs) -> Tuple[int, str, str]:
f"stdout:\n{stdout}\n"
f"stderr:\n{stderr}\n"
)
return result.returncode, stdout.decode("utf-8"), stderr.decode(
"utf-8"
return (
result.returncode,
stdout.decode(errors="ignore"),
stderr.decode(errors="ignore"),
)

def shell(self, cmd: str) -> Tuple[int, str, str]:
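The switch to `decode(errors="ignore")` in the hunk above guards against non-UTF-8 bytes in raw `adb` output. A minimal standalone sketch (not ModelConverter code) of the failure mode it avoids:

```python
# adb can emit bytes that are not valid UTF-8 (e.g. binary noise in device logs).
raw = b"log line \xff with invalid byte"

try:
    raw.decode("utf-8")  # strict mode raises on the stray 0xFF byte
except UnicodeDecodeError as e:
    print(f"strict decode failed: {e.reason}")

# errors="ignore" silently drops undecodable bytes instead of raising
print(raw.decode("utf-8", errors="ignore"))
```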